Review: xformers 0.0.29.post2-1
Review Information
rejected — allocated to babelouest 1 month, 17 days ago, started 1 month, 17 days ago, completed 1 month, 12 days ago.
Final Comment
The following source files carry the original package license even though they claim to be adapted from other sources:
xformers/benchmarks/LRA/code/dataset.py
xformers/benchmarks/LRA/setup/*
xformers/components/input_projection.py
===========================================
Missing xformers/csrc/attention/computeUtil.h
// taken from
// https://github.com/hgyhungry/ge-spmm/blob/master/pytorch-custom/computeUtil.h
// with minor modifications
===========================================
Missing xformers/csrc/attention/cpu/sddmm.cpp and xformers/csrc/attention/cpu/spmm.cpp
Copyright 2025 The Google Research Authors.
Apache License, Version 2.0
===========================================
Missing ./xformers/csrc/attention/hip_decoder
Advanced Micro Devices, Inc.
BSD-3-Clause
===========================================
Missing xformers/csrc/attention/hip_fmha/instances/ck_tiled_rand_uniform_kernal.h
Advanced Micro Devices, Inc.
MIT
===========================================
Missing xformers/csrc/sparse24/gemm.cu
COPY-PASTE FROM PyTorch
PyTorch/Caffe2 authors
PyTorch license