A common situation is that an image can represent more than one class, e.g. an image may contain both an oil tanker and an oil platform. You may also have missing labels for some of these outputs that you don’t want to evaluate. This problem occurs in my research, and my solution so far has been my own [criterion_ignore](https://github.com/gforge/criterion_ignore) that sets the errors for ignored labels to zero. This post is a quick look at how to combine the torch-dataframe with criterion_ignore.
# All posts in the *torch-dataframe* series
1. [Intro to the torch-dataframe][intro]
2. [Modifications][mods]
3. [Subsetting][subs]
4. [The mnist example][mnist ex]
5. [Multilabel classification][multilabel]
[intro]: http://gforge.se/2016/08/deep-learning-with-torch-dataframe-a-gentle-introduction-to-torch/
[mods]: http://gforge.se/2016/08/the-torch-dataframe-basics-on-modifications/
[subs]: http://gforge.se/2016/08/the-torch-dataframe-subsetting-and-sampling/
[mnist ex]: http://gforge.se/2016/08/integration-between-torchnet-and-torch-dataframe-a-closer-look-at-the-mnist-example/
[multilabel]: http://gforge.se/2016/08/setting-up-a-multilabel-classification-network-with-torch-dataframe/
# Install criterion_ignore
I haven’t yet uploaded the rockspec to the luarocks repository, so for now you need to install directly from GitHub:
```bash
luarocks install https://raw.githubusercontent.com/gforge/criterion_ignore/master/rocks/criterion_ignore-0.2-1.rockspec
```
# Preparing the torch-dataframe
First we start by loading the data. In my case all columns, except the filename column that points to the image, need to be converted to categorical values:
```lua
require 'Dataframe'

dataset = Dataframe(data_path)
for _,key in pairs(dataset.columns) do
  if (key ~= "Filename") then
    dataset:as_categorical(key)
  end
end
```
As criterion_ignore skips labels that are equal to zero, we need to replace the `nan` values with 0:
```lua
dataset:fill_all_na() -- uses 0 by default
```
Next we get the maximum integer value of each output column in order to find out how many neurons each output needs:
```lua
local no_outputs = dataset:get_max_value{with_named_keys = true}
```
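To make it concrete, `no_outputs` is simply a table that maps each categorical column to its largest label value, which is also the number of neurons that output needs. A hypothetical example (the column names and counts below are made up for illustration):

```lua
-- Hypothetical content of no_outputs for a dataset with two categorical
-- label columns; the names and counts are made up for illustration:
local no_outputs = {
  VesselType   = 5, -- five vessel categories   -> five output neurons
  PlatformType = 3  -- three platform categories -> three output neurons
}
```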
# Setting up the criterion_ignore
Now we’ll do the interesting part: add an `nn.Linear` with the indicated number of neurons for each output inside a [`nn.ConcatTable`](https://github.com/torch/nn/blob/master/doc/table.md#nn.ConcatTable) structure (a ConcatTable feeds the output of the previous layer to each of the parallel branches stacked on top of it). Each criterion is then added to the criterion_ignore with 0 as the ignore label:
```lua
require 'criterion_ignore'

-- Add the different outputs and the criterion.
-- The model is assumed to be defined earlier and to end with a
-- 4096-dimensional feature layer that the parallel heads attach to.
criterion = criterion_ignore.Parallel.new()
local prl = nn.ConcatTable()
local output_count = 0
for i=1,#dataset.column_order do
  local column_name = dataset.column_order[i]
  if (no_outputs[column_name] ~= nil) then
    output_count = output_count + 1
    print("Adding output no " .. output_count ..
          " for " .. column_name .. " containing " ..
          no_outputs[column_name] .. " neurons")

    -- Add the network layer
    local seq = nn.Sequential()
    seq:add(nn.Linear(4096, no_outputs[column_name]))
    seq:add(nn.LogSoftMax())
    prl:add(seq)

    -- Add the criterion for the layer
    criterion:add{
      criterion = nn.ClassNLLCriterion(),
      ignore = 0}
  end
end

criterion:cuda()
model:add(prl:cuda())
```
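To give an idea of how this is used during training, here is a minimal sketch of a single forward/backward pass. It assumes that the parallel criterion follows the standard `nn` criterion interface, taking the table of outputs from the `ConcatTable` heads together with a matching table of target tensors in which 0 marks a missing label; the batch size, image dimensions and learning rate are made up for illustration:

```lua
require 'cunn'

-- Hypothetical batch: 8 images and one target tensor per output head,
-- where a target of 0 means "missing label, skip this sample".
local input = torch.Tensor(8, 3, 224, 224):uniform():cuda()
local targets = {
  torch.Tensor(8):random(0, 5):cuda(), -- labels for the first head, 0 = missing
  torch.Tensor(8):random(0, 3):cuda()  -- labels for the second head, 0 = missing
}

local outputs = model:forward(input)              -- table with one tensor per head
local loss = criterion:forward(outputs, targets)  -- ignored labels contribute 0
print("Loss: " .. loss)

local gradOutputs = criterion:backward(outputs, targets)
model:zeroGradParameters()
model:backward(input, gradOutputs)
model:updateParameters(0.01) -- plain SGD step with a made-up learning rate
```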
# Summary
In this post I’ve covered the basics of how to use the *torch-dataframe* together with `nn.ConcatTable` for a multilabel classification task. There are most likely plenty of other approaches, but I’ve found this one to work fine. If you need more help with the implementation, have a look at the mnist example.