
BLAS/LAPACK calls need numeric traits #154

Open
timholy opened this issue Dec 29, 2014 · 9 comments

Comments

@timholy
Member

timholy commented Dec 29, 2014

If I wrap a numeric type, I can't use BLAS or LAPACK calls anymore. This hurts performance considerably:

module Elements

export MyEl

import Base: +, *

immutable MyEl{T}
    el::T
end

Base.zero{T}(::Type{MyEl{T}}) = MyEl{T}(zero(T))
Base.conj!{T<:Real}(A::Array{MyEl{T}}) = A
Base.promote_rule{T,S}(::Type{MyEl{T}}, ::Type{MyEl{S}}) = MyEl{promote_type(T,S)}
+{S,T}(a::MyEl{S}, b::MyEl{T}) = MyEl{promote_type(S,T)}(a.el+b.el)
*{S,T}(a::MyEl{S}, b::MyEl{T}) = MyEl{promote_type(S,T)}(a.el*b.el)

end

julia> using Elements

julia> A = rand(200,199);

julia> B = reinterpret(MyEl{Float64}, A);

# After warmup
julia> @time A'*A;
elapsed time: 0.002651306 seconds (316976 bytes allocated)

julia> @time B'*B;
elapsed time: 0.020106977 seconds (317216 bytes allocated)

It's using the generic_matmatmul! fallback, since a MyEl{Float64} is not recognized as a type that one can pass to BLAS/LAPACK.

It would seem that the best approach would be to define a set of numeric traits and have the BLAS/LAPACK routines use them. I'm submitting this issue to open discussion about how best to do this. Here's one proposal:

blastype{T}(::Type{T}) = T
blastype{T}(::Type{MyEl{T}}) = T  # defined by package author

some_blas_call(A::StridedArray) = some_blas_call(blastype(eltype(A)), A)
function some_blas_call{T<:BlasFloat}(::Type{T}, A)
    # the version that calls blas goes here
end
function some_blas_call{T}(::Type{T}, A)
    # the generic fallback goes here
end

[cjh: colorized]
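A runnable sketch of this trait-dispatch pattern in current Julia syntax (the `blastype` name follows the proposal above; `mysum` is a hypothetical stand-in for a BLAS-backed routine, and `sum(reinterpret(...))` stands in for the fast path):

```julia
# Trait mapping an element type to its underlying BLAS-compatible type.
blastype(::Type{T}) where {T} = T

struct MyEl{T}          # wrapper type, as in the example above
    el::T
end
blastype(::Type{MyEl{T}}) where {T} = T   # defined by the package author
Base.:+(a::MyEl, b::MyEl) = MyEl(a.el + b.el)
Base.zero(::Type{MyEl{T}}) where {T} = MyEl(zero(T))

# Entry point dispatches on the trait rather than on eltype(A) directly.
mysum(A::AbstractArray) = _mysum(blastype(eltype(A)), A)

# "BLAS" path: the element type is (or wraps) a BLAS float, so we can
# reinterpret and hand the raw floats to a fast routine.
_mysum(::Type{T}, A) where {T<:Union{Float32,Float64}} = sum(reinterpret(T, A))

# Generic fallback for element types with no BLAS equivalent.
_mysum(::Type, A) = sum(A)
```

With this, `mysum` on a `Matrix{MyEl{Float64}}` (or on a reinterpreted array of them) takes the fast path, while other element types fall through to the generic method.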

@timholy
Member Author

timholy commented Dec 29, 2014

Another option that Jeff suggested is to pair the type and the array in a tuple, e.g.,

some_blas_call(A::StridedArray) = some_blas_call((A,blastype(eltype(A))))
function some_blas_call{T<:BlasComplex}(Atype::(StridedArray,Type{T}))
    A = Atype[1]
    # the version that calls blas goes here
end
function some_blas_call{T}(Atype::(StridedArray,Type{T}))
    A = Atype[1]
    # the generic fallback goes here
end

This has the advantage that you don't have to worry about which type goes with which argument. I've been burned by this when creating start methods (it gets confused by the iteration methods defined for tuples, leading to weird bugs), but I suspect it should be safe for any linear algebra code (all of which loads after inference.jl).

@jiahao
Member

jiahao commented Dec 29, 2014

A possible use case is unitful matrix computations, c.f. #128.

@Jutho
Contributor

Jutho commented Dec 29, 2014

+1.
It might be good to combine a BLAS overhaul with an isstrided trait as well.

@timholy
Member Author

timholy commented Dec 30, 2014

Agreed, might as well do both at once.

@andreasnoack
Member

I like the purpose of this, but we should also consider the structure of all of our matrix multiplication and solve code here. Maybe it can be simplified while adding this functionality.

Right now when multiplying two matrices A*B we have the following sequence of calls

  1. * allocates the output array and calls
  2. A_mul_B!, which is useful when you want to avoid allocation, but it simply calls
  3. gemm_wrapper!, which does a lot of (possibly expensive) checks and calls
    1. matmul2x2 when the dimensions are 2x2
    2. matmul3x3 when the dimensions are 3x3
    3. BLAS.gemm! when the element types are the same and one of the BlasFloat types
    4. generic_matmatmul! for anything else
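Schematically, the chain above looks like the following (with hypothetical my_* names; the small-matrix special cases are omitted, the real definitions live in matmul.jl, and in later Julia versions A_mul_B! became LinearAlgebra.mul!):

```julia
using LinearAlgebra

# Step 1: * allocates the output and delegates to the in-place layer.
function my_matmul(A::Matrix, B::Matrix)
    T = promote_type(eltype(A), eltype(B))
    C = Matrix{T}(undef, size(A, 1), size(B, 2))
    return my_mul!(C, A, B)
end

# Steps 2-3: the in-place layer runs checks, then picks a kernel.
function my_mul!(C, A, B)
    size(A, 2) == size(B, 1) || throw(DimensionMismatch("inner dimensions"))
    T = eltype(A)
    if T <: Union{Float32,Float64} && eltype(B) == T == eltype(C)
        # Element types match and are BLAS floats: call the library.
        return BLAS.gemm!('N', 'N', one(T), A, B, zero(T), C)
    end
    return my_generic_matmul!(C, A, B)   # anything else: generic fallback
end

# Naive triple loop standing in for generic_matmatmul!.
function my_generic_matmul!(C, A, B)
    fill!(C, zero(eltype(C)))
    @inbounds for j in axes(B, 2), k in axes(A, 2), i in axes(A, 1)
        C[i, j] += A[i, k] * B[k, j]
    end
    return C
end
```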

Having a version of BLAS.gemm! that calls generic_matmatmul! seems backward.

Also, so far I think the guideline has been that the A_mul_B! functions constitute the generic interface, while BLAS.gemm! is not generic but only a thin wrapper around the library. Adding traits to the BLAS methods introduces two layers of generic methods for matrix multiplication. Maybe we should instead write generic versions matching all the BLAS functions, in place of matmul.jl, and add traits to these new BLAS-like methods.

I have a branch that adds pointer versions of the BLAS level 2 and 3 wrappers. These might be useful for adding support for wrapped BLAS floats. I think it is done, but I got a little tired of writing tests for the new methods, which is why it is not in master yet.

@timholy
Member Author

timholy commented Dec 30, 2014

👍

@dlfivefifty
Contributor

Ref: JuliaLang/julia#25558

Not sure how to tackle this issue in that PR, though.

@Jutho
Contributor

Jutho commented Feb 28, 2018

@dlfivefifty, one way to encode the information that MyEl{T} actually behaves as T would be via the parameter in MemoryLayout{T}, which you've just removed in your latest commit. That is, the parameter in MemoryLayout{T} would not necessarily be the element type of the AbstractArray (that would be duplicate information, which is why you removed it), but rather, when possible, the equivalent standard built-in type used to communicate with BLAS etc.

I don't think I'm doing a good job of explaining this, and it's probably not easy to implement, but the idea is that an AbstractArray{S} would have a MemoryLayout{T} where T = blastype(S) (or some other name than blastype). New types like @timholy's MyEl{T} would have blastype(MyEl{T}) = T.
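A tiny sketch of what this could look like (DenseLayout and memorylayout here are hypothetical stand-ins, not the actual MemoryLayout API from LazyArrays):

```julia
struct MyEl{T}                       # wrapper type from the original example
    el::T
end

blastype(::Type{T}) where {T} = T            # default: the type itself
blastype(::Type{MyEl{T}}) where {T} = T      # wrapper advertises its raw type

# Hypothetical layout type whose parameter is the BLAS-facing type,
# not necessarily the array's element type.
struct DenseLayout{T} end

memorylayout(::AbstractArray{S}) where {S} = DenseLayout{blastype(S)}()
```

A Matrix{MyEl{Float64}} would then report DenseLayout{Float64}(), so layout-based dispatch could route it to BLAS.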

@dlfivefifty
Contributor

Yes, that's not a bad idea, but I do prefer the update to MemoryLayout without the {T}, for a variety of reasons (it's cleaner, and the parameter wasn't used anywhere in dispatch).

It also doesn't solve the problem that BLAS.gemv! requires a BlasFloat to work.

@KristofferC KristofferC transferred this issue from JuliaLang/julia Nov 26, 2024