
Working with different base net #4

Open
Ram-Godavarthi opened this issue Aug 13, 2018 · 8 comments

@Ram-Godavarthi

Ram-Godavarthi commented Aug 13, 2018

Can I use MobileNet as the base net instead of VGG?
If yes, where exactly should I randomize the weights?
Is it something here
and here? Should I change something?

Is there any other place where I need to change it?
And what should be done to the config.py file so it does not include the param file? this .

Also, how did you write the names of the VGG16 layers in fixed_params (config.py line 113)?
What are the names of the layers of the mobilenetv2_0.25 model?

Please help me with some hints regarding this.
Thank you.

@WalterMa
Owner

If you just want to change the base network from VGG to MobileNet, simply create your feature extraction layers and top feature extraction layers like here.

The parameter initializers should be configured when you define the layers; the two places you referred to only invoke the initializers you predefined.

@WalterMa
Owner

fixed_params is the array of the parameter names of the layers you want to fix.
You could print net.collect_params() and copy the names you want to fix,
then modify this function.

@Ram-Godavarthi
Author

If I don't want to fix any params, how do I remove the fixed-params part? I want to train from scratch, because my dataset is different.

Also, I trained the network with a MobileNet base, but I am not getting a good mAP; I am getting an mAP as low as 0.003.

What could be the reason?
What exactly should I change here to include MobileNet? Please help me with this.

I did something like this:
I removed the Flatten layer,
and removed the last 5 layers from the top of the base:

for layer in base_model.features[:-5]:

and added the bottom 3 layers to the RCNN head:

for layer in base_model.features[-3:]:
    self.fc_layers.add(layer)
self.cls_score = nn.Dense(in_units=160, units=num_classes, weight_initializer=initializer.Normal(0.01))
self.bbox_pred = nn.Dense(in_units=160, units=num_classes * 4, weight_initializer=initializer.Normal(0.001))

Is this correct?
Or should I do it a different way?

@WalterMa
Owner

If you don't want to fix any params, just comment out this line:

fix_net_params(net, cfg.network)

For your second question, I'm not familiar with MobileNet, so I can't help more, sorry.
Maybe you could post your code in detail.

@Ram-Godavarthi
Author

Okay, thank you, I will do that.
I will try to work on it first, then post the code if something isn't working.

@Ram-Godavarthi
Author

Ram-Godavarthi commented Aug 17, 2018

I have done something like this to include MobileNet as the base network.

But I am getting an mAP of 0 even after 5 epochs.
I have randomized the weights.
I have not changed anything else in the source code.

Please help me get a reasonable mAP, and let me know what mistake I have made in the network.

from mxnet import initializer
from mxnet.gluon import nn, HybridBlock


class MobileNet_mod(nn.HybridBlock):
    def __init__(self, base_model, multiplier=0.25, classes=3, **kwargs):
        super(MobileNet_mod, self).__init__(**kwargs)
        with self.name_scope():
            self.features = nn.HybridSequential(prefix='')
            # Keep all but the last 35 layers of the pretrained features
            for layer in base_model.features[:-35]:
                self.features.add(layer)

    def hybrid_forward(self, F, x, *args, **kwargs):
        x = self.features(x)
        return x


class MObFastRCNNHead(HybridBlock):
    def __init__(self, base_model, num_classes, feature_stride, **kwargs):
        super(MObFastRCNNHead, self).__init__(**kwargs)
        self.feature_stride = feature_stride
        self.bottom = nn.HybridSequential()
        # Include the last 2 mobilenet feature layers
        for layer in base_model.features[-2:]:
            self.bottom.add(layer)
        self.cls_score = nn.Dense(in_units=128, units=num_classes,
                                  weight_initializer=initializer.Normal(0.01))
        self.bbox_pred = nn.Dense(in_units=128, units=num_classes * 4,
                                  weight_initializer=initializer.Normal(0.001))

    def hybrid_forward(self, F, feature_map, rois):
        x = F.ROIPooling(data=feature_map, rois=rois, pooled_size=(3, 3),
                         spatial_scale=1.0 / self.feature_stride)
        x = self.bottom(x)
        cls_score = self.cls_score(x)
        cls_prob = F.softmax(data=cls_score)  # shape (roi_num, num_classes)
        bbox_pred = self.bbox_pred(x)  # shape (roi_num, num_classes * 4)
        return cls_prob, bbox_pred

@WalterMa
Owner

Which MobileNet did you use and pass as base_model?
Also, check whether the pooled_size and feature_stride configurations fit your network.

And you could debug into the network and check whether the outputs are correct.

@Ram-Godavarthi
Author

I passed mobilenet0.25; it is MobileNet V1 with alpha 0.25.
I excluded the last 11 layers in the feature extraction,
and used the last 2 layers (AvgPool2D and Flatten) in the RCNN part.
I changed the pooling size to (3, 3).

Should I change anything else?
I have set the stride to 16, since my input data is 512x512 as of now.
