Accessor method implementations (continued) #2307
iamshnoo wants to merge 9 commits into mlpack:master from iamshnoo:master
Conversation
…/mlpack into accessor_implementation
Accessor implementation
@iamshnoo Always keep the master branch clean for rebasing. If you try to rebase this PR now, it will cause problems for you. Make sure that any changes you make go on a feature branch.
@himanshupathak21061998 I am kind of new to the git workflow; I had never heard of rebasing before this. But thank you, I will keep this in mind for any new PRs I create. Unless, of course, you are suggesting that I should do that for this PR itself?
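The workflow being recommended above can be sketched as a few git commands. This is a hedged illustration run in a throwaway scratch repository (the feature-branch name is borrowed from this PR; the "upstream change" commit simulates mlpack:master moving forward):

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email you@example.com; git config user.name you
main=$(git symbolic-ref --short HEAD)      # default branch name (master on older git)
echo base > file; git add file; git commit -qm "base"

git checkout -qb accessor_implementation   # do all PR work on a feature branch
echo feature >> file; git commit -qam "Accessor implementation"

git checkout -q "$main"                    # simulate upstream moving forward
echo upstream > other; git add other; git commit -qm "upstream change"

git checkout -q accessor_implementation
git rebase -q "$main"                      # replay feature commits on the updated master
git log --oneline
```

Because master was never touched by the feature work, the rebase replays cleanly and `git log` shows the feature commit sitting on top of the upstream change.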
    //! Get the value of the includeKl parameter.
    bool IncludeKL() const { return includeKl; }

    //! Get the value of the beta hyperparameter.
    double Beta() const { return beta; }
Nice detail with distinguishing parameter and hyperparameter :)
@iamshnoo could you add a simple test to check if the returned values are correct? You'll need to make changes to the file src/mlpack/tests/ann_layer_test.cpp.
I am going to try some experiments with git now, and in case I mess up, I will probably delete my fork and restart. If that happens, I might have to re-open this PR separately, so just a heads up about this! And @sreenikSS, sure, I will write down those tests as soon as I can. Thanks!
Update forked master with upstream changes.
zoq left a comment
That should at least resolve the first error; maybe there are others as well.
    OutputDataType& Gradient() { return gradient; }

    //! Get the input width.
    size_t const& InputWidth() const { return width; }
Some modules in mlpack only check for InputWidth and expect that OutputWidth is there as well; either we remove the getter here, or we add a pseudo OutputWidth method too.
I messed up. I am going to delete my fork now!
As per the suggestions made in the last few comments in #2269, I am opening this PR. As far as I can understand, this only has the 7 commits with 22 files changed by me, and the rest is the same as the master branch of mlpack as of now. I hope I am doing this right :)
@zoq @sreenikSS If you guys find some time, could you please review the changes.
I basically did all the work in my feature branch, then updated the master branch of my fork today and merged the feature branch into my master. Now I am trying to merge this master with mlpack:master.
I am keeping this PR as a draft for now, until it is reviewed. Never mind, I am switching from draft to review mode. I guess I just wanted to use the draft feature badly :)