Commit
improve docs
CarloLucibello committed Apr 7, 2022
1 parent 590a209 commit fb2d834
Showing 4 changed files with 27 additions and 3 deletions.
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -70,6 +70,7 @@ jobs:
using Documenter
using Documenter: doctest
DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive=true)
+DocMeta.setdocmeta!(Flux.Losses, :DocTestFilters, :(r"[0-9\.]+f0"); recursive=true)
doctest(Flux)'
- run: julia --project=docs docs/make.jl
env:
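For context: the regex added above is a Documenter doctest filter. Any match is stripped from both the expected and the actual output before comparison, so doctests don't fail when the low-order digits of printed `Float32` values differ across platforms. A minimal sketch of what the pattern matches (illustrative strings, not part of the commit):

```julia
julia> occursin(r"[0-9\.]+f0", "0.009084041f0")  # printed Float32 literals match
true

julia> occursin(r"[0-9\.]+f0", "0.009084041")    # plain Float64 output does not
false
```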
1 change: 0 additions & 1 deletion docs/make.jl
@@ -1,7 +1,6 @@
using Documenter, Flux, NNlib, Functors, MLUtils

DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive = true)
-DocMeta.setdocmeta!(Flux.Losses, :DocTestSetup, :(using Flux.Losses); recursive = true)

# In the Losses module, doctests which differ in the printed Float32 values won't fail
DocMeta.setdocmeta!(Flux.Losses, :DocTestFilters, :(r"[0-9\.]+f0"); recursive = true)
4 changes: 2 additions & 2 deletions docs/src/models/losses.md
@@ -19,8 +19,8 @@ loss(ŷ, y)

They are commonly passed as arrays of size `num_target_features x num_examples_in_batch`.

-Most loss functions in Flux have an optional argument `agg`, denoting the type of aggregation performed over the
-batch:
+Most losses in Flux have an optional argument `agg` accepting a function to be used as
+a final aggregation:

```julia
loss(ŷ, y) # defaults to `mean`
```
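To sketch how the reworded `agg` keyword behaves (an illustration assuming the current `Flux.Losses` keyword API, with made-up data; not part of the commit):

```julia
using Flux.Losses: mse

ŷ = [1.1 2.2; 0.9 1.8]  # model outputs, features × batch
y = [1.0 2.0; 1.0 2.0]  # targets

mse(ŷ, y)                               # default: `mean` over all elements
mse(ŷ, y; agg = sum)                    # total error instead of the mean
mse(ŷ, y; agg = x -> sum(x; dims = 2))  # reduce over the batch, one value per feature
```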
24 changes: 24 additions & 0 deletions src/losses/functions.jl
@@ -8,6 +8,8 @@ Return the loss corresponding to mean absolute error:
# Examples
```jldoctest
+julia> using Flux.Losses: mae
julia> y_model = [1.1, 1.9, 3.1];
julia> mae(y_model, 1:3)
@@ -31,6 +33,8 @@ See also: [`mae`](@ref), [`msle`](@ref), [`crossentropy`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: mse
julia> y_model = [1.1, 1.9, 3.1];
julia> y_true = 1:3;
@@ -57,6 +61,8 @@ Penalizes an under-estimation more than an over-estimation.
# Examples
```jldoctest
+julia> using Flux.Losses: msle
julia> msle(Float32[1.1, 2.2, 3.3], 1:3)
0.009084041f0
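As an aside on the claim in the hunk header above, that `msle` penalizes an under-estimation more than an over-estimation, a hedged sketch with made-up values:

```julia
using Flux.Losses: msle

y = [10.0]
msle([5.0], y)   # under-estimate by 5: larger loss (≈ 0.48)
msle([15.0], y)  # over-estimate by 5: smaller loss (≈ 0.16)
```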
@@ -113,6 +119,8 @@ of label smoothing to binary distributions encoded in a single number.
# Examples
```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
julia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)
2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
⋅ ⋅ ⋅ 1 ⋅ 1
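To make the smoothing concrete, a brief sketch (assuming the two-argument `label_smoothing(y, α)` form with its default `dims = 1`; not part of the commit):

```julia
using Flux
using Flux.Losses: label_smoothing

y = Flux.onehotbatch([1, 0], 0:1)  # hard one-hot labels
label_smoothing(y, 0.2f0)          # with α = 0.2, hard 1s become 0.9 and hard 0s become 0.1
```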
@@ -179,6 +187,8 @@ See also: [`logitcrossentropy`](@ref), [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
julia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)
3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
1 ⋅ ⋅ ⋅ 1
@@ -232,6 +242,8 @@ See also: [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref), [`label_smoothing`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: crossentropy, logitcrossentropy
julia> y_label = onehotbatch(collect("abcabaa"), 'a':'c')
3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
1 ⋅ ⋅ 1 ⋅ 1 1
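The relation between the two functions imported above, as a hedged sketch with random scores (`softmax` and `onehotbatch` as re-exported by Flux; not part of the commit):

```julia
using Flux: softmax, onehotbatch
using Flux.Losses: crossentropy, logitcrossentropy

y = onehotbatch(collect("abcabaa"), 'a':'c')
ŷ = randn(Float32, 3, 7)  # raw, unnormalised model scores (logits)

# Same value, but the logit form is more numerically stable:
logitcrossentropy(ŷ, y) ≈ crossentropy(softmax(ŷ), y)
```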
@@ -273,7 +285,10 @@ computing the loss.
See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: binarycrossentropy, crossentropy
julia> y_bin = Bool[1,0,1]
3-element Vector{Bool}:
1
@@ -314,7 +329,10 @@ Mathematically equivalent to
See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: binarycrossentropy, logitbinarycrossentropy
julia> y_bin = Bool[1,0,1];
julia> y_model = Float32[2, -1, pi]
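Likewise for the binary case, a hedged check of the equivalence this docstring states, reusing the doctest's values (`sigmoid` as re-exported by Flux):

```julia
using Flux: sigmoid
using Flux.Losses: binarycrossentropy, logitbinarycrossentropy

y_bin   = Bool[1, 0, 1]
y_model = Float32[2, -1, pi]

# Equal up to floating-point error; the logit form avoids a separate σ pass:
logitbinarycrossentropy(y_model, y_bin) ≈ binarycrossentropy(sigmoid.(y_model), y_bin)
```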
@@ -348,6 +366,8 @@ from the other. It is always non-negative, and zero only when both the distributions are equal.
# Examples
```jldoctest
+julia> using Flux.Losses: kldivergence
julia> p1 = [1 0; 0 1]
2×2 Matrix{Int64}:
1 0
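A small sketch of the two properties named in the hunk header (non-negative, and zero only for identical distributions), with made-up probability vectors; not part of the commit:

```julia
using Flux.Losses: kldivergence

p = [0.4, 0.6]
q = [0.3, 0.7]

kldivergence(p, p)  # ≈ 0 for identical distributions
kldivergence(p, q)  # > 0 otherwise
```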
@@ -467,6 +487,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`binarycrossentropy`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: binary_focal_loss
julia> y = [0 1 0
1 0 1]
2×3 Matrix{Int64}:
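A hedged check of the `γ == 0` equivalence stated above, with made-up probabilities (this assumes the Unicode keyword `γ` used by Flux around the time of this commit):

```julia
using Flux.Losses: binary_focal_loss, binarycrossentropy

y = [0 1 0; 1 0 1]              # binary targets
ŷ = [0.2 0.8 0.3; 0.8 0.2 0.7]  # predicted probabilities

binary_focal_loss(ŷ, y; γ = 0) ≈ binarycrossentropy(ŷ, y)  # true
```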
@@ -509,6 +531,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`crossentropy`](@ref).
# Examples
```jldoctest
+julia> using Flux.Losses: focal_loss
julia> y = [1 0 0 0 1
0 1 0 1 0
0 0 1 0 0]
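And the multi-class counterpart of the same `γ == 0` equivalence, again as a sketch assuming the `γ` keyword:

```julia
using Flux: softmax
using Flux.Losses: crossentropy, focal_loss

y = [1 0 0 0 1; 0 1 0 1 0; 0 0 1 0 0]  # one-hot targets, 3 classes × 5 examples
ŷ = softmax(randn(Float32, 3, 5))      # made-up predicted probabilities

focal_loss(ŷ, y; γ = 0) ≈ crossentropy(ŷ, y)  # true
```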
