`@functor` is a bad name and nobody knows to use it #1946
Comments
Is something like this more readable?

```julia
@parameter Dense trainable=(:w, :b)
# or...
build!(Dense, trainable=(:w, :b))
```
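For contrast, here is a minimal sketch of the status quo these suggestions would replace, assuming the Flux/Functors API at the time of this issue (where restricting trained fields needs a separate `trainable` overload):

```julia
using Flux, Functors

struct MyDense
    w
    b
end

# Today's opt-in spelling that this issue finds unintuitive:
Functors.@functor MyDense

# Narrowing which fields are trained is a separate overload:
Flux.trainable(d::MyDense) = (d.w, d.b)
```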
Agree that "functor" is a pretty obscure name. Once upon a time it was …; also agree that something which does … The happiest situation would be not to need such a thing at all, for most layers. It could be something like: …
I bet almost all models would just work with 1, 2, 3. It's a constraint that you have to satisfy 2, but not a difficult one, and we aren't aiming to handle everything in the world, only things written to be used with Flux.
In the short term, a convenience macro seems the path of least resistance. Longer term, we'll want to look at discussions like JuliaObjects/ConstructionBase.jl#54 to figure out what underlying infrastructure would be required for the "happiest situation".
Related to FluxML/Functors.jl#49. It would be nice to assume every type is "functorized" by default.
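As a sketch of what "functorized" means in practice (assuming the Functors.jl API), opting a type in lets `fmap` walk its fields down to the array leaves and rebuild the struct:

```julia
using Functors

struct Affine
    w
    b
end
@functor Affine  # opt in: Functors can now traverse `w` and `b`

a = Affine([1.0, 2.0], [3.0])
# `fmap` applies the function to every array leaf and reconstructs an Affine:
doubled = fmap(x -> 2 .* x, a)
```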
That would be great, but the tricky part about functorizing by default is that it turns the opt-in mechanism into an opt-out one. e.g. if someone defines a custom layer type that contains a ValueHistories or DataStructures type, … Some of these cases could be avoided by allowing user/FluxML-owned parent types to exclude certain fields, but it's not hard to come up with a counterexample there either. Opting out for certain types via piracy is another option, but a suboptimal one at best. If we can come up with a satisfactory solution for this problem, then I don't see much else in the way of auto-functorization.
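A hypothetical illustration of the opt-out problem (the layer name and `history` field are invented for this sketch; the `@functor T (fields...)` restricted form is Functors.jl syntax):

```julia
using Functors

# A hypothetical layer carrying bookkeeping state alongside parameters:
struct LoggedDense
    w
    b
    history::Vector{Float64}  # stand-in for e.g. a ValueHistories container
end

# Under functorization-by-default, `fmap` (and hence gradients/optimisers)
# would descend into `history` too, since it is also an array leaf.
# Today the author opts in only the parameter fields explicitly:
@functor LoggedDense (w, b)
```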
Sensational title, but I've been seeing many recent examples of code overloading `trainable` instead. Given that there's no fundamental reason to call the macro which makes a struct a trainable layer "functor", we could at least alias it to something more intuitive. RFC on ideas?