The `haskell-gi` build fails its doctests because of a missing
library; I'm not 100% convinced that wrapping it in `dontCheck` is the
right thing to do, but I don't have a better idea at the moment.
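For illustration, here is a minimal sketch of such an override written
as a user overlay; the real change lives in the Haskell package set's
configuration, so the shape below is illustrative rather than the
actual diff:

```nix
# Sketch only: disable the test suite for haskell-gi until its missing
# doctest dependency can be provided.
self: super: {
  haskellPackages = super.haskellPackages.override {
    overrides = hself: hsuper: {
      haskell-gi = super.haskell.lib.dontCheck hsuper.haskell-gi;
    };
  };
}
```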
The `gi-gdkx11` build fails because, surprise, Gdk-X11 isn't found; by
looking around in my store, I found that that library seems to live in
gtk3 these days. This override is just a stop-gap, though; I've also
submitted a change to cabal2nix that I believe will fix the automatic
generation of the package in the future.
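The stop-gap could look roughly like the following, in the same overlay
shape as above; `addPkgconfigDepend` simply adds `gtk3` (which ships
the Gdk-X11 pkg-config file) to the build, and this is an illustration
rather than the exact override that landed:

```nix
# Sketch only: point gi-gdkx11 at gtk3, which is where Gdk-X11 lives
# these days.
self: super: {
  haskellPackages = super.haskellPackages.override {
    overrides = hself: hsuper: {
      gi-gdkx11 =
        super.haskell.lib.addPkgconfigDepend hsuper.gi-gdkx11 super.gtk3;
    };
  };
}
```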
This adds KMyMoney, a finance manager for KDE, plus a few required
dependencies.
I ran the upstream test suite as well as the following manual tests:
* Basic startup
* Completing the wizard
* Adding some test transactions
* GPG encryption
* Generation of charts and reports
* Rough check whether the OFX integration lists supported financial
  institutions.
* Small check of the AqBanking integration: accounts and users can be
  configured, but I didn't test actual connectivity with a financial
  institution.
* Check of the Weboob integration with a test PayPal backend, though
  only with a dummy account and without actually connecting to PayPal.
On top of that, the application is already being used by the person
who asked me to package it, so I'd guess it works well enough.
I'm merging this without a review from @ttuegel because it only adds
packages and doesn't change anything fundamental about the KDE
ecosystem.
The only change here is adding C++ support to `mpir`; the maintainer
(@7c6f434c) has approved the change.
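For context, enabling C++ support in an autotools package like `mpir`
boils down to a configure flag; a hedged sketch follows (the flag name
follows GMP's `--enable-cxx` switch, which MPIR inherits, and the real
change edits the mpir derivation directly rather than overriding it):

```nix
# Sketch only: --enable-cxx builds the C++ bindings (libmpirxx)
# alongside the C library.
self: super: {
  mpir = super.mpir.overrideAttrs (old: {
    # assumes configureFlags is list-valued in the existing derivation
    configureFlags = (old.configureFlags or []) ++ [ "--enable-cxx" ];
  });
}
```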
* pytorch-0.3 with optional cuda and cudnn
* pytorch tests reenabled if compiling without cuda
* pytorch: Conditionalize cudnn dependency on cudaSupport
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
* pytorch: Compile with the same GCC version used by CUDA if cudaSupport
Fixes this error:
In file included from /nix/store/gv7w3c71jg627cpcff04yi6kwzpzjyap-cudatoolkit-9.1.85.1/include/host_config.h:50:0,
from /nix/store/gv7w3c71jg627cpcff04yi6kwzpzjyap-cudatoolkit-9.1.85.1/include/cuda_runtime.h:78,
from <command-line>:0:
/nix/store/gv7w3c71jg627cpcff04yi6kwzpzjyap-cudatoolkit-9.1.85.1/include/crt/host_config.h:121:2: error: #error -- unsupported GNU version! gcc versions later than 6 are not supported!
#error -- unsupported GNU version! gcc versions later than 6 are not supported!
^~~~~
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
* pytorch: Build with joined cudatoolkit
Similar to #30058 for TensorFlow.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
* pytorch: 0.3.0 -> 0.3.1
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
* pytorch: Patch for “refcounted file mapping not supported” failure
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
* pytorch: Skip distributed tests
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
* pytorch: Use the stub libcuda.so from cudatoolkit for running tests
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
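To make the CUDA-related commits above a bit more concrete, here is a
rough sketch of the plumbing they describe (conditional `cudnn`, joined
`cudatoolkit`, GCC 6 for the CUDA build, tests only without CUDA); the
attribute names are illustrative and not taken from the actual nixpkgs
expression:

```nix
{ lib, stdenv, overrideCC, symlinkJoin, cudatoolkit, cudnn, gcc6
, cudaSupport ? false }:

let
  # "Build with joined cudatoolkit": merge the toolkit's split outputs
  # back into one tree, as was done for TensorFlow in #30058; assumes
  # the usual out/lib outputs of the cudatoolkit derivation.
  cudatoolkit_joined = symlinkJoin {
    name = "${cudatoolkit.name}-unsplit";
    paths = [ cudatoolkit.out cudatoolkit.lib ];
  };
in {
  # cudnn (and the joined toolkit) are dependencies only when CUDA
  # support is enabled.
  cudaInputs = lib.optionals cudaSupport [ cudatoolkit_joined cudnn ];

  # CUDA 9.1 rejects GCC newer than 6 (the error quoted above), so the
  # CUDA build uses a GCC 6 based stdenv.
  buildStdenv = if cudaSupport then overrideCC stdenv gcc6 else stdenv;

  # Tests stay enabled only for the non-CUDA build.
  doCheck = !cudaSupport;
}
```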
The latest update of `yowsup` (https://github.com/tgalal/yowsup/releases/tag/v2.5.7)
contains the following fixes:
* Updated tokens
* Fixed tgalal/yowsup#1842: Bug in protocol_groups RemoveGroupsNotificationProtocolEntity
* Other minor bug fixes
The `argparse-dependency.patch` required a rebase onto the latest
version of `setup.py`; it ensures that `argparse` isn't pulled in as an
extra dependency, since our `python3` package ships `argparse` by
default.
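Illustratively, the effect of that patch could also be written as a
`postPatch` hook; the quoted `install_requires` entry is an assumption
about yowsup's `setup.py`, so treat this as a sketch of the idea rather
than the patch itself:

```nix
{
  # Sketch only: drop argparse from install_requires at build time
  # instead of carrying a patch. The quoted string is assumed, not
  # verified against yowsup's setup.py.
  postPatch = ''
    substituteInPlace setup.py \
      --replace "'argparse'," ""
  '';
}
```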
A short note on Python 2 support:
the actual issue with Python 2.x support has been resolved
(https://github.com/tgalal/yowsup/issues/2325#issuecomment-354533727);
however, this relies on `six==1.10`, which isn't supported by `nixpkgs`
since `six` has been bumped to `1.11`. When trying to inject a patched
version of our `six` package based on `six==1.10`, you'll run into
issues with duplicated libraries in your closure, because other build
dependencies (`pytest` in this case) use the latest `six` version. As
Python 2.7 will die in 2020 (https://pythonclock.org/) and patching
around in the dependencies of `pytest` just to get `yowsup` running
isn't worth the effort in my opinion, I decided to keep the Python 2.x
build disabled for now.
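In nixpkgs terms, keeping the package Python-3-only comes down to the
usual `disabled` gate; a minimal sketch, with the surrounding
derivation attributes (src, patches, test inputs) elided:

```nix
{ buildPythonPackage, isPy3k }:

buildPythonPackage rec {
  pname = "yowsup";
  version = "2.5.7";
  # src, patches and test dependencies elided; this only sketches the
  # Python-version gate.

  # Python 2 support would need six == 1.10, which nixpkgs no longer
  # carries, so keep the build restricted to Python 3 for now.
  disabled = !isPy3k;
}
```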