# Formatting and indentation fixes #1

Merged 1 commit on Oct 13, 2016.
59 changes: 28 additions & 31 deletions in `Creating Extensions using FFI.md`
First, you have to write your C functions.

Below you can find an example implementation of the forward and backward functions of a module that adds both of its inputs.

In your `.c` files you can include TH using an `#include <TH/TH.h>` directive, and THC using `#include <THC/THC.h>`.

The ffi utils will make sure the compiler can find them during the build.

```C
/* src/my_lib.c */
int my_lib_add_forward(THFloatTensor *input1, THFloatTensor *input2,
                       THFloatTensor *output)
{
  if (!THFloatTensor_isSameSizeAs(input1, input2))
    return 0;
  THFloatTensor_resizeAs(output, input1);
  THFloatTensor_add(output, input1, input2);
  return 1;
}

int my_lib_add_backward(THFloatTensor *grad_output, THFloatTensor *grad_input)
{
  THFloatTensor_resizeAs(grad_input, grad_output);
  THFloatTensor_fill(grad_input, 1);
  return 1;
}
```
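To make the contract of these C functions concrete, here is a pure-Python stand-in that mirrors their behavior on plain lists. The function names match the C code above, but the list-based "tensors" and everything else are illustrative assumptions, not part of the PR:

```python
# Pure-Python stand-in for the C functions above (illustrative only).
# Plain lists play the role of THFloatTensor; the 0/1 return codes
# mirror the failure/success convention of the C implementation.

def my_lib_add_forward(input1, input2, output):
    if len(input1) != len(input2):  # mirrors THFloatTensor_isSameSizeAs
        return 0
    # mirrors THFloatTensor_resizeAs + THFloatTensor_add
    output[:] = [a + b for a, b in zip(input1, input2)]
    return 1

def my_lib_add_backward(grad_output, grad_input):
    # mirrors THFloatTensor_resizeAs + THFloatTensor_fill(grad_input, 1)
    grad_input[:] = [1.0] * len(grad_output)
    return 1
```

Because the gradient of addition with respect to each input is 1, the backward pass simply fills the gradient buffer with ones, just as the C version does.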

You also need a header file that declares these functions. It will be used by the ffi utils to generate appropriate wrappers.

```C
/* src/my_lib.h */
int my_lib_add_forward(THFloatTensor *input1, THFloatTensor *input2, THFloatTensor *output);
int my_lib_add_backward(THFloatTensor *grad_output, THFloatTensor *grad_input);
```

```python
# build.py (fragment; the rest of the script is collapsed in the diff)
with_cuda=False
```
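For context, a full `build.py` around the `with_cuda=False` line might look like the sketch below. It uses the `torch.utils.ffi.create_extension` helper that existed in PyTorch at the time (since removed), and the file names are assumptions carried over from the snippets above:

```python
# build.py -- hedged sketch, not the PR's actual file.
# torch.utils.ffi.create_extension was PyTorch's FFI build helper at the
# time of this PR; it has since been removed from PyTorch.
from torch.utils.ffi import create_extension

ffi = create_extension(
    '_ext.my_lib',              # import path of the generated module
    headers=['src/my_lib.h'],   # declarations used to generate wrappers
    sources=['src/my_lib.c'],   # C implementations
    with_cuda=False,            # this example uses no THC/CUDA code
)

if __name__ == '__main__':
    ffi.build()
```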

## Step 2: Include it in your Python code

After you run it, PyTorch will create an `_ext` directory and put `my_lib` inside.

The package name can have an arbitrary number of packages preceding the final module name (including none).
If the build succeeded, you can import your extension just like a regular Python module.
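For illustration (these package names are hypothetical, not from the PR), the name passed at build time maps directly to the import path:

```python
# Hypothetical name -> import mappings (illustrative only):
#   '_ext.my_lib'      ->  from _ext import my_lib
#   'my_lib'           ->  import my_lib
#   'pkg._ext.my_lib'  ->  from pkg._ext import my_lib
```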
```python
# functions/add.py
import torch
from torch.autograd import Function
from _ext import my_lib


class MyAddFunction(Function):

    def forward(self, input1, input2):
        output = torch.FloatTensor()
        my_lib.my_lib_add_forward(input1, input2, output)
        return output

    def backward(self, grad_output):
        grad_input = torch.FloatTensor()
        my_lib.my_lib_add_backward(grad_output, grad_input)
        return grad_input
```

```python
# modules/add.py
from torch.nn import Module
from functions.add import MyAddFunction

class MyAddModule(Module):

    def forward(self, input1, input2):
        return MyAddFunction()(input1, input2)
```

```python
# main.py
import torch
import torch.nn as nn
from torch.autograd import Variable
from modules.add import MyAddModule

class MyNetwork(nn.Container):
    def __init__(self):
        super(MyNetwork, self).__init__(
            add=MyAddModule(),
        )

    def forward(self, input1, input2):
        return self.add(input1, input2)

model = MyNetwork()
input1, input2 = Variable(torch.randn(5, 5)), Variable(torch.randn(5, 5))
```