Merge staging-next into staging

Frederik Rietdijk 2020-11-04 09:28:07 +01:00
commit 10c57af49c
172 changed files with 7114 additions and 4476 deletions

View File

@ -19,6 +19,7 @@
<xi:include href="ios.section.xml" />
<xi:include href="java.xml" />
<xi:include href="lua.section.xml" />
<xi:include href="maven.section.xml" />
<xi:include href="node.section.xml" />
<xi:include href="ocaml.xml" />
<xi:include href="perl.xml" />

View File

@ -0,0 +1,354 @@
---
title: Maven
author: Farid Zakaria
date: 2020-10-15
---
# Maven
Maven is a well-known build tool for the Java ecosystem; however, it poses some challenges when integrating with the Nix build system.
The following provides a list of common patterns for packaging a Maven project (or any JVM language that can export to Maven) as a Nix package.
For the purposes of this example, let's consider a very basic Maven project with the following `pom.xml`, which has a single dependency on [emoji-java](https://github.com/vdurmont/emoji-java).
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>io.github.fzakaria</groupId>
<artifactId>maven-demo</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
<name>NixOS Maven Demo</name>
<dependencies>
<dependency>
<groupId>com.vdurmont</groupId>
<artifactId>emoji-java</artifactId>
<version>5.1.1</version>
</dependency>
</dependencies>
</project>
```
Our main class file will be very simple:
```java
import com.vdurmont.emoji.EmojiParser;
public class Main {
public static void main(String[] args) {
String str = "NixOS :grinning: is super cool :smiley:!";
String result = EmojiParser.parseToUnicode(str);
System.out.println(result);
}
}
```
You can find this demo project at https://github.com/fzakaria/nixos-maven-example
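For reference, outside of Nix the demo can be built and run with a network-enabled Maven directly; the `exec:java` goal below comes from the standard `exec-maven-plugin` and `Main` is the class defined above (a local sketch, not part of the Nix packaging itself):
```bash
# compile the project and run Main, letting Maven resolve dependencies
# into the default ~/.m2/repository
mvn compile exec:java -Dexec.mainClass=Main
```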
## Solving for dependencies
### buildMaven with NixOS/mvn2nix-maven-plugin
> ⚠️ Although `buildMaven` is the "blessed" way within nixpkgs, as of 2020, it hasn't seen much activity in quite a while.
`buildMaven` is an alternative method that tries to follow patterns similar to those of other language ecosystems by generating a lock file. It relies on the Maven plugin [mvn2nix-maven-plugin](https://github.com/NixOS/mvn2nix-maven-plugin).
First, generate a `project-info.json` file using the Maven plugin.
> This should be executed in the project's source repository, or be told which `pom.xml` to use.
```bash
# run this step within the project's source repository
mvn org.nixos.mvn2nix:mvn2nix-maven-plugin:mvn2nix
cat project-info.json | jq | head
{
"project": {
"artifactId": "maven-demo",
"groupId": "org.nixos",
"version": "1.0",
"classifier": "",
"extension": "jar",
"dependencies": [
{
"artifactId": "maven-resources-plugin",
```
This file is then given to the `buildMaven` function, and it returns two attributes.
**`repo`**:
A Maven repository that is a symlink farm of all the dependencies found in `project-info.json`.
**`build`**:
A simple derivation that runs through `mvn compile` & `mvn package` to build the JAR. You may use this as inspiration for more complicated derivations.
Here is an [example](https://github.com/fzakaria/nixos-maven-example/blob/main/build-maven-repository.nix) of building the Maven repository
```nix
{ pkgs ? import <nixpkgs> { } }:
with pkgs;
(buildMaven ./project-info.json).repo
```
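A corresponding sketch that uses the `build` attribute instead would produce the compiled JAR rather than just the repository (assuming the same `project-info.json`):
```nix
{ pkgs ? import <nixpkgs> { } }:
with pkgs;
# run the Maven build itself against the generated repository
(buildMaven ./project-info.json).build
```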
The benefit over the _double invocation_, as we will see below, is that the _/nix/store_ entry is a _linkFarm_ of every package, so changes to your dependency set don't require downloading everything from scratch.
```bash
tree $(nix-build --no-out-link build-maven-repository.nix) | head
/nix/store/g87va52nkc8jzbmi1aqdcf2f109r4dvn-maven-repository
├── antlr
│   └── antlr
│   └── 2.7.2
│   ├── antlr-2.7.2.jar -> /nix/store/d027c8f2cnmj5yrynpbq2s6wmc9cb559-antlr-2.7.2.jar
│   └── antlr-2.7.2.pom -> /nix/store/mv42fc5gizl8h5g5vpywz1nfiynmzgp2-antlr-2.7.2.pom
├── avalon-framework
│   └── avalon-framework
│   └── 4.1.3
│   ├── avalon-framework-4.1.3.jar -> /nix/store/iv5fp3955w3nq28ff9xfz86wvxbiw6n9-avalon-framework-4.1.3.jar
```
### Double Invocation
> ⚠️ This pattern is the simplest but may cause unnecessary rebuilds due to the output hash changing.
The double invocation is a _simple_ way to get around the problem that `nix-build` may be sandboxed and have no Internet connectivity.
It treats the entire Maven repository as a single source to be downloaded, relying on Maven's dependency resolution to satisfy the output hash. This is similar to fetchers like `fetchgit`, except it has to run a Maven build to determine what to download.
The first step will be to build the Maven project as a fixed-output derivation in order to collect the Maven repository -- below is an [example](https://github.com/fzakaria/nixos-maven-example/blob/main/double-invocation-repository.nix).
> Traditionally the Maven repository is at `~/.m2/repository`. We will override this to be the `$out` directory.
```nix
{ stdenv, maven }:
stdenv.mkDerivation {
name = "maven-repository";
buildInputs = [ maven ];
src = ./.; # or fetchFromGitHub, cleanSourceWith, etc
buildPhase = ''
mvn package -Dmaven.repo.local=$out
'';
# keep only *.{pom,jar,sha1,nbm} and delete all ephemeral files with lastModified timestamps inside
installPhase = ''
find $out -type f \
-name \*.lastUpdated -or \
-name resolver-status.properties -or \
-name _remote.repositories \
-delete
'';
# don't do any fixup
dontFixup = true;
outputHashAlgo = "sha256";
outputHashMode = "recursive";
# replace this with the correct SHA256
outputHash = stdenv.lib.fakeSha256;
}
```
The build will fail and tell you the expected `outputHash` to use. Once you've set the hash, the build will produce a `/nix/store` entry whose contents are the full Maven repository (see the workflow sketch after the directory listing below).
> Some additional files are deleted because they would otherwise cause the output hash to change on subsequent runs.
```bash
tree $(nix-build --no-out-link double-invocation-repository.nix) | head
/nix/store/8kicxzp98j68xyi9gl6jda67hp3c54fq-maven-repository
├── backport-util-concurrent
│   └── backport-util-concurrent
│   └── 3.1
│   ├── backport-util-concurrent-3.1.pom
│   └── backport-util-concurrent-3.1.pom.sha1
├── classworlds
│   └── classworlds
│   ├── 1.1
│   │   ├── classworlds-1.1.jar
```
If your package uses _SNAPSHOT_ dependencies or _version ranges_, there is a strong likelihood that your output hash will change over time, since the resolved dependencies may change. Hence this method is less recommended than using `buildMaven`.
## Building a JAR
Regardless of which strategy is chosen above, the step to build the derivation is the same.
```nix
{ stdenv, lib, maven, callPackage }:
# pick a repository derivation, here we will use buildMaven
let repository = callPackage ./build-maven-repository.nix { };
in stdenv.mkDerivation rec {
pname = "maven-demo";
version = "1.0";
src = builtins.fetchTarball "https://github.com/fzakaria/nixos-maven-example/archive/main.tar.gz";
buildInputs = [ maven ];
buildPhase = ''
echo "Using repository ${repository}"
mvn --offline -Dmaven.repo.local=${repository} package;
'';
installPhase = ''
install -Dm644 target/${pname}-${version}.jar $out/share/java/${pname}-${version}.jar
'';
}
```
> We place the library in `$out/share/java` since the JDK package has a _stdenv setup hook_ that adds any JARs found in the `share/java` directories of the build inputs to the CLASSPATH environment variable.
```bash
tree $(nix-build --no-out-link build-jar.nix)
/nix/store/7jw3xdfagkc2vw8wrsdv68qpsnrxgvky-maven-demo-1.0
└── share
└── java
└── maven-demo-1.0.jar
2 directories, 1 file
```
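As a hypothetical illustration of that setup hook, a downstream derivation that lists our JAR package (imported here from the `build-jar.nix` above) together with a JDK in its build inputs would see the JAR appear on `$CLASSPATH` during its build:
```nix
{ stdenv, jdk, callPackage }:
let
  # the derivation from build-jar.nix above
  mavenDemo = callPackage ./build-jar.nix { };
in stdenv.mkDerivation {
  name = "classpath-check";
  # the JDK setup hook adds mavenDemo's share/java JARs to $CLASSPATH
  buildInputs = [ jdk mavenDemo ];
  dontUnpack = true;
  buildPhase = ''
    echo "CLASSPATH is: $CLASSPATH"
  '';
  installPhase = "touch $out";
}
```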
## Runnable JAR
The previous example builds a JAR file, but that's not a file one can run directly.
You need to invoke it with `java -jar $out/share/java/output.jar` and make sure to provide the required dependencies on the classpath.
The following explains how to use `makeWrapper` in order to make the derivation produce an executable that will run the JAR file you created.
We will use the same repository we built above (either _double invocation_ or _buildMaven_) to set up a CLASSPATH for our JAR.
The following two methods are better suited to Nix than building an [UberJar](https://imagej.net/Uber-JAR), which may be the more traditional approach.
### CLASSPATH
> This is ideal if you are providing a derivation for _nixpkgs_ and don't want to patch the project's `pom.xml`.
We will read the Maven repository and flatten it to a single list. This list will then be concatenated with the _CLASSPATH_ separator to create the full classpath.
We make sure to provide this classpath to `makeWrapper`.
```nix
{ stdenv, lib, maven, callPackage, makeWrapper, jre }:
let
repository = callPackage ./build-maven-repository.nix { };
in stdenv.mkDerivation rec {
pname = "maven-demo";
version = "1.0";
src = builtins.fetchTarball
"https://github.com/fzakaria/nixos-maven-example/archive/main.tar.gz";
buildInputs = [ maven makeWrapper ];
buildPhase = ''
echo "Using repository ${repository}"
mvn --offline -Dmaven.repo.local=${repository} package;
'';
installPhase = ''
mkdir -p $out/bin
classpath=$(find ${repository} -name "*.jar" -printf ':%h/%f');
install -Dm644 target/${pname}-${version}.jar $out/share/java/${pname}-${version}.jar
# create a wrapper that will automatically set the classpath
# this should be the paths from the dependency derivation
makeWrapper ${jre}/bin/java $out/bin/${pname} \
--add-flags "-classpath $out/share/java/${pname}-${version}.jar:''${classpath#:}" \
--add-flags "Main"
'';
}
```
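Assuming the derivation above is saved as, say, `classpath-jar.nix` (a hypothetical file name), the wrapper can be invoked directly and should print the same emoji greeting shown at the end of this document:
```bash
$(nix-build --no-out-link classpath-jar.nix)/bin/maven-demo
```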
### MANIFEST file via Maven Plugin
> This is ideal if you are the project owner and want to change your `pom.xml` to set the CLASSPATH within it.
Augment the `pom.xml` to create a JAR with the following manifest:
```xml
<build>
<plugins>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>../../repository/</classpathPrefix>
<classpathLayoutType>repository</classpathLayoutType>
<mainClass>Main</mainClass>
</manifest>
<manifestEntries>
<Class-Path>.</Class-Path>
</manifestEntries>
</archive>
</configuration>
</plugin>
</plugins>
</build>
```
The above plugin configuration instructs the JAR to look for its dependencies in the relative `../../repository/` directory, which is laid out in the _maven repository_ style.
```bash
unzip -q -c $(nix-build --no-out-link runnable-jar.nix)/share/java/maven-demo-1.0.jar META-INF/MANIFEST.MF
Manifest-Version: 1.0
Archiver-Version: Plexus Archiver
Built-By: nixbld
Class-Path: . ../../repository/com/vdurmont/emoji-java/5.1.1/emoji-jav
a-5.1.1.jar ../../repository/org/json/json/20170516/json-20170516.jar
Created-By: Apache Maven 3.6.3
Build-Jdk: 1.8.0_265
Main-Class: Main
```
We will modify the derivation above to create a symlink to our repository during the `installPhase`, so that it's accessible to our JAR at runtime.
```nix
{ stdenv, lib, maven, callPackage, makeWrapper, jre }:
# pick a repository derivation, here we will use buildMaven
let repository = callPackage ./build-maven-repository.nix { };
in stdenv.mkDerivation rec {
pname = "maven-demo";
version = "1.0";
src = builtins.fetchTarball
"https://github.com/fzakaria/nixos-maven-example/archive/main.tar.gz";
buildInputs = [ maven makeWrapper ];
buildPhase = ''
echo "Using repository ${repository}"
mvn --offline -Dmaven.repo.local=${repository} package;
'';
installPhase = ''
mkdir -p $out/bin
# create a symbolic link for the repository directory
ln -s ${repository} $out/repository
install -Dm644 target/${pname}-${version}.jar $out/share/java/${pname}-${version}.jar
# create a wrapper that will automatically set the classpath
# this should be the paths from the dependency derivation
makeWrapper ${jre}/bin/java $out/bin/${pname} \
--add-flags "-jar $out/share/java/${pname}-${version}.jar"
'';
}
```
> Our script produces a dependency on `jre` rather than `jdk` to restrict the runtime closure necessary to run the application.
This will give you an executable shell script that launches your JAR with all of its dependencies available.
```bash
tree $(nix-build --no-out-link runnable-jar.nix)
/nix/store/8d4c3ibw8ynsn01ibhyqmc1zhzz75s26-maven-demo-1.0
├── bin
│   └── maven-demo
├── repository -> /nix/store/g87va52nkc8jzbmi1aqdcf2f109r4dvn-maven-repository
└── share
└── java
└── maven-demo-1.0.jar
$(nix-build --no-out-link --option tarball-ttl 1 runnable-jar.nix)/bin/maven-demo
NixOS 😀 is super cool 😃!
```

View File

@ -42,7 +42,19 @@
<itemizedlist>
<listitem>
<para />
<para>
<link xlink:href="https://www.keycloak.org/">Keycloak</link>,
an open source identity and access management server with
support for <link
xlink:href="https://openid.net/connect/">OpenID Connect</link>,
<link xlink:href="https://oauth.net/2/">OAUTH 2.0</link> and
<link xlink:href="https://en.wikipedia.org/wiki/SAML_2.0">SAML
2.0</link>.
</para>
<para>
See the <link linkend="module-services-keycloak">Keycloak
section of the NixOS manual</link> for more information.
</para>
</listitem>
</itemizedlist>
@ -112,6 +124,16 @@
<literal>/var/lib/powerdns</literal> to <literal>/run/pdns</literal>.
</para>
</listitem>
<listitem>
<para>
<package>riak-cs</package> package removed along with <varname>services.riak-cs</varname> module.
</para>
</listitem>
<listitem>
<para>
<package>stanchion</package> package removed along with <varname>services.stanchion</varname> module.
</para>
</listitem>
</itemizedlist>
</section>

View File

@ -290,8 +290,8 @@ in
hound = 259;
leaps = 260;
ipfs = 261;
stanchion = 262;
riak-cs = 263;
# stanchion = 262; # unused, removed 2020-10-14
# riak-cs = 263; # unused, removed 2020-10-14
infinoted = 264;
sickbeard = 265;
headphones = 266;
@ -593,8 +593,8 @@ in
hound = 259;
leaps = 260;
ipfs = 261;
stanchion = 262;
riak-cs = 263;
# stanchion = 262; # unused, removed 2020-10-14
# riak-cs = 263; # unused, removed 2020-10-14
infinoted = 264;
sickbeard = 265;
headphones = 266;

View File

@ -296,8 +296,6 @@
./services/databases/postgresql.nix
./services/databases/redis.nix
./services/databases/riak.nix
./services/databases/riak-cs.nix
./services/databases/stanchion.nix
./services/databases/victoriametrics.nix
./services/databases/virtuoso.nix
./services/desktops/accountsservice.nix
@ -394,6 +392,7 @@
./services/logging/logcheck.nix
./services/logging/logrotate.nix
./services/logging/logstash.nix
./services/logging/promtail.nix
./services/logging/rsyslogd.nix
./services/logging/syslog-ng.nix
./services/logging/syslogd.nix
@ -403,7 +402,6 @@
./services/mail/dovecot.nix
./services/mail/dspam.nix
./services/mail/exim.nix
./services/mail/freepops.nix
./services/mail/mail.nix
./services/mail/mailcatcher.nix
./services/mail/mailhog.nix
@ -865,6 +863,7 @@
./services/web-apps/ihatemoney
./services/web-apps/jirafeau.nix
./services/web-apps/jitsi-meet.nix
./services/web-apps/keycloak.nix
./services/web-apps/limesurvey.nix
./services/web-apps/mattermost.nix
./services/web-apps/mediawiki.nix

View File

@ -1,202 +0,0 @@
{ config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.riak-cs;
in
{
###### interface
options = {
services.riak-cs = {
enable = mkEnableOption "riak-cs";
package = mkOption {
type = types.package;
default = pkgs.riak-cs;
defaultText = "pkgs.riak-cs";
example = literalExample "pkgs.riak-cs";
description = ''
Riak package to use.
'';
};
nodeName = mkOption {
type = types.str;
default = "riak-cs@127.0.0.1";
description = ''
Name of the Erlang node.
'';
};
anonymousUserCreation = mkOption {
type = types.bool;
default = false;
description = ''
Anonymous user creation.
'';
};
riakHost = mkOption {
type = types.str;
default = "127.0.0.1:8087";
description = ''
Name of riak hosting service.
'';
};
listener = mkOption {
type = types.str;
default = "127.0.0.1:8080";
description = ''
Name of Riak CS listening service.
'';
};
stanchionHost = mkOption {
type = types.str;
default = "127.0.0.1:8085";
description = ''
Name of stanchion hosting service.
'';
};
stanchionSsl = mkOption {
type = types.bool;
default = true;
description = ''
Tell stanchion to use SSL.
'';
};
distributedCookie = mkOption {
type = types.str;
default = "riak";
description = ''
Cookie for distributed node communication. All nodes in the
same cluster should use the same cookie or they will not be able to
communicate.
'';
};
dataDir = mkOption {
type = types.path;
default = "/var/db/riak-cs";
description = ''
Data directory for Riak CS.
'';
};
logDir = mkOption {
type = types.path;
default = "/var/log/riak-cs";
description = ''
Log directory for Riak CS.
'';
};
extraConfig = mkOption {
type = types.lines;
default = "";
description = ''
Additional text to be appended to <filename>riak-cs.conf</filename>.
'';
};
extraAdvancedConfig = mkOption {
type = types.lines;
default = "";
description = ''
Additional text to be appended to <filename>advanced.config</filename>.
'';
};
};
};
###### implementation
config = mkIf cfg.enable {
environment.systemPackages = [ cfg.package ];
environment.etc."riak-cs/riak-cs.conf".text = ''
nodename = ${cfg.nodeName}
distributed_cookie = ${cfg.distributedCookie}
platform_log_dir = ${cfg.logDir}
riak_host = ${cfg.riakHost}
listener = ${cfg.listener}
stanchion_host = ${cfg.stanchionHost}
anonymous_user_creation = ${if cfg.anonymousUserCreation then "on" else "off"}
${cfg.extraConfig}
'';
environment.etc."riak-cs/advanced.config".text = ''
${cfg.extraAdvancedConfig}
'';
users.users.riak-cs = {
name = "riak-cs";
uid = config.ids.uids.riak-cs;
group = "riak";
description = "Riak CS server user";
};
systemd.services.riak-cs = {
description = "Riak CS Server";
wantedBy = [ "multi-user.target" ];
after = [ "network.target" ];
path = [
pkgs.utillinux # for `logger`
pkgs.bash
];
environment.HOME = "${cfg.dataDir}";
environment.RIAK_CS_DATA_DIR = "${cfg.dataDir}";
environment.RIAK_CS_LOG_DIR = "${cfg.logDir}";
environment.RIAK_CS_ETC_DIR = "/etc/riak";
preStart = ''
if ! test -e ${cfg.logDir}; then
mkdir -m 0755 -p ${cfg.logDir}
chown -R riak-cs ${cfg.logDir}
fi
if ! test -e ${cfg.dataDir}; then
mkdir -m 0700 -p ${cfg.dataDir}
chown -R riak-cs ${cfg.dataDir}
fi
'';
serviceConfig = {
ExecStart = "${cfg.package}/bin/riak-cs console";
ExecStop = "${cfg.package}/bin/riak-cs stop";
StandardInput = "tty";
User = "riak-cs";
Group = "riak-cs";
PermissionsStartOnly = true;
# Give Riak a decent amount of time to clean up.
TimeoutStopSec = 120;
LimitNOFILE = 65536;
};
unitConfig.RequiresMountsFor = [
"${cfg.dataDir}"
"${cfg.logDir}"
"/etc/riak"
];
};
};
}

View File

@ -1,194 +0,0 @@
{ config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.stanchion;
in
{
###### interface
options = {
services.stanchion = {
enable = mkEnableOption "stanchion";
package = mkOption {
type = types.package;
default = pkgs.stanchion;
defaultText = "pkgs.stanchion";
example = literalExample "pkgs.stanchion";
description = ''
Stanchion package to use.
'';
};
nodeName = mkOption {
type = types.str;
default = "stanchion@127.0.0.1";
description = ''
Name of the Erlang node.
'';
};
adminKey = mkOption {
type = types.str;
default = "";
description = ''
Name of admin user.
'';
};
adminSecret = mkOption {
type = types.str;
default = "";
description = ''
Name of admin secret
'';
};
riakHost = mkOption {
type = types.str;
default = "127.0.0.1:8087";
description = ''
Name of riak hosting service.
'';
};
listener = mkOption {
type = types.str;
default = "127.0.0.1:8085";
description = ''
Name of Riak CS listening service.
'';
};
stanchionHost = mkOption {
type = types.str;
default = "127.0.0.1:8085";
description = ''
Name of stanchion hosting service.
'';
};
distributedCookie = mkOption {
type = types.str;
default = "riak";
description = ''
Cookie for distributed node communication. All nodes in the
same cluster should use the same cookie or they will not be able to
communicate.
'';
};
dataDir = mkOption {
type = types.path;
default = "/var/db/stanchion";
description = ''
Data directory for Stanchion.
'';
};
logDir = mkOption {
type = types.path;
default = "/var/log/stanchion";
description = ''
Log directory for Stanchion.
'';
};
extraConfig = mkOption {
type = types.lines;
default = "";
description = ''
Additional text to be appended to <filename>stanchion.conf</filename>.
'';
};
};
};
###### implementation
config = mkIf cfg.enable {
environment.systemPackages = [ cfg.package ];
environment.etc."stanchion/advanced.config".text = ''
[{stanchion, []}].
'';
environment.etc."stanchion/stanchion.conf".text = ''
listener = ${cfg.listener}
riak_host = ${cfg.riakHost}
${optionalString (cfg.adminKey == "") "#"} admin.key=${optionalString (cfg.adminKey != "") cfg.adminKey}
${optionalString (cfg.adminSecret == "") "#"} admin.secret=${optionalString (cfg.adminSecret != "") cfg.adminSecret}
platform_bin_dir = ${pkgs.stanchion}/bin
platform_data_dir = ${cfg.dataDir}
platform_etc_dir = /etc/stanchion
platform_lib_dir = ${pkgs.stanchion}/lib
platform_log_dir = ${cfg.logDir}
nodename = ${cfg.nodeName}
distributed_cookie = ${cfg.distributedCookie}
${cfg.extraConfig}
'';
users.users.stanchion = {
name = "stanchion";
uid = config.ids.uids.stanchion;
group = "stanchion";
description = "Stanchion server user";
};
users.groups.stanchion.gid = config.ids.gids.stanchion;
systemd.tmpfiles.rules = [
"d '${cfg.logDir}' - stanchion stanchion --"
"d '${cfg.dataDir}' 0700 stanchion stanchion --"
];
systemd.services.stanchion = {
description = "Stanchion Server";
wantedBy = [ "multi-user.target" ];
after = [ "network.target" ];
path = [
pkgs.utillinux # for `logger`
pkgs.bash
];
environment.HOME = "${cfg.dataDir}";
environment.STANCHION_DATA_DIR = "${cfg.dataDir}";
environment.STANCHION_LOG_DIR = "${cfg.logDir}";
environment.STANCHION_ETC_DIR = "/etc/stanchion";
serviceConfig = {
ExecStart = "${cfg.package}/bin/stanchion console";
ExecStop = "${cfg.package}/bin/stanchion stop";
StandardInput = "tty";
User = "stanchion";
Group = "stanchion";
# Give Stanchion a decent amount of time to clean up.
TimeoutStopSec = 120;
LimitNOFILE = 65536;
};
unitConfig.RequiresMountsFor = [
"${cfg.dataDir}"
"${cfg.logDir}"
"/etc/stanchion"
];
};
};
}

View File

@ -0,0 +1,95 @@
{ config, lib, pkgs, ... }: with lib;
let
cfg = config.services.promtail;
prettyJSON = conf: pkgs.runCommandLocal "promtail-config.json" {} ''
echo '${builtins.toJSON conf}' | ${pkgs.buildPackages.jq}/bin/jq 'del(._module)' > $out
'';
in {
options.services.promtail = with types; {
enable = mkEnableOption "the Promtail ingresser";
configuration = mkOption {
type = with lib.types; let
valueType = nullOr (oneOf [
bool
int
float
str
(lazyAttrsOf valueType)
(listOf valueType)
]) // {
description = "JSON value";
emptyValue.value = {};
deprecationMessage = null;
};
in valueType;
description = ''
Specify the configuration for Promtail in Nix.
'';
};
extraFlags = mkOption {
type = listOf str;
default = [];
example = [ "--server.http-listen-port=3101" ];
description = ''
Specify a list of additional command line flags,
which get escaped and are then passed to Promtail.
'';
};
};
config = mkIf cfg.enable {
services.promtail.configuration.positions.filename = mkDefault "/var/cache/promtail/positions.yaml";
systemd.services.promtail = {
description = "Promtail log ingress";
wantedBy = [ "multi-user.target" ];
stopIfChanged = false;
serviceConfig = {
Restart = "on-failure";
ExecStart = "${pkgs.grafana-loki}/bin/promtail -config.file=${prettyJSON cfg.configuration} ${escapeShellArgs cfg.extraFlags}";
ProtectSystem = "strict";
ProtectHome = true;
PrivateTmp = true;
PrivateDevices = true;
ProtectKernelTunables = true;
ProtectControlGroups = true;
RestrictSUIDSGID = true;
PrivateMounts = true;
CacheDirectory = "promtail";
User = "promtail";
Group = "promtail";
CapabilityBoundingSet = "";
NoNewPrivileges = true;
ProtectKernelModules = true;
SystemCallArchitectures = "native";
ProtectKernelLogs = true;
ProtectClock = true;
LockPersonality = true;
ProtectHostname = true;
RestrictRealtime = true;
MemoryDenyWriteExecute = true;
PrivateUsers = true;
} // (optionalAttrs (!pkgs.stdenv.isAarch64) { # FIXME: figure out why this breaks on aarch64
SystemCallFilter = "@system-service";
});
};
users.groups.promtail = {};
users.users.promtail = {
description = "Promtail service user";
isSystemUser = true;
group = "promtail";
};
};
}

View File

@ -1,89 +0,0 @@
{ config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.mail.freepopsd;
in
{
options = {
services.mail.freepopsd = {
enable = mkOption {
default = false;
type = with types; bool;
description = ''
Enables Freepops, a POP3 webmail wrapper.
'';
};
port = mkOption {
default = 2000;
type = with types; uniq int;
description = ''
Port on which the pop server will listen.
'';
};
threads = mkOption {
default = 5;
type = with types; uniq int;
description = ''
Max simultaneous connections.
'';
};
bind = mkOption {
default = "0.0.0.0";
type = types.str;
description = ''
Bind over an IPv4 address instead of any.
'';
};
logFile = mkOption {
default = "/var/log/freepopsd";
example = "syslog";
type = types.str;
description = ''
Filename of the log file or syslog to rely on the logging daemon.
'';
};
suid = {
user = mkOption {
default = "nobody";
type = types.str;
description = ''
User name under which freepopsd will be after binding the port.
'';
};
group = mkOption {
default = "nogroup";
type = types.str;
description = ''
Group under which freepopsd will be after binding the port.
'';
};
};
};
};
config = mkIf cfg.enable {
systemd.services.freepopsd = {
description = "Freepopsd (webmail over POP3)";
after = [ "network.target" ];
wantedBy = [ "multi-user.target" ];
script = ''
${pkgs.freepops}/bin/freepopsd \
-p ${toString cfg.port} \
-t ${toString cfg.threads} \
-b ${cfg.bind} \
-vv -l ${cfg.logFile} \
-s ${cfg.suid.user}.${cfg.suid.group}
'';
};
};
}

View File

@ -44,6 +44,13 @@ in {
enable = mkEnableOption "Interplanetary File System (WARNING: may cause severe network degradation)";
package = mkOption {
type = types.package;
default = pkgs.ipfs;
defaultText = "pkgs.ipfs";
description = "Which IPFS package to use.";
};
user = mkOption {
type = types.str;
default = "ipfs";
@ -176,7 +183,7 @@ in {
###### implementation
config = mkIf cfg.enable {
environment.systemPackages = [ pkgs.ipfs ];
environment.systemPackages = [ cfg.package ];
environment.variables.IPFS_PATH = cfg.dataDir;
programs.fuse = mkIf cfg.autoMount {
@ -207,14 +214,14 @@ in {
"d '${cfg.ipnsMountDir}' - ${cfg.user} ${cfg.group} - -"
];
systemd.packages = [ pkgs.ipfs ];
systemd.packages = [ cfg.package ];
systemd.services.ipfs-init = {
description = "IPFS Initializer";
environment.IPFS_PATH = cfg.dataDir;
path = [ pkgs.ipfs ];
path = [ cfg.package ];
script = ''
if [[ ! -f ${cfg.dataDir}/config ]]; then
@ -239,7 +246,7 @@ in {
};
systemd.services.ipfs = {
path = [ "/run/wrappers" pkgs.ipfs ];
path = [ "/run/wrappers" cfg.package ];
environment.IPFS_PATH = cfg.dataDir;
wants = [ "ipfs-init.service" ];
@ -267,7 +274,7 @@ in {
cfg.extraConfig))
);
serviceConfig = {
ExecStart = ["" "${pkgs.ipfs}/bin/ipfs daemon ${ipfsFlags}"];
ExecStart = ["" "${cfg.package}/bin/ipfs daemon ${ipfsFlags}"];
User = cfg.user;
Group = cfg.group;
} // optionalAttrs (cfg.serviceFdlimit != null) { LimitNOFILE = cfg.serviceFdlimit; };

View File

@ -69,6 +69,11 @@ let
if-carrier-up = "";
}.${cfg.wait}}
${optionalString (config.networking.enableIPv6 == false) ''
# Don't solicit or accept IPv6 Router Advertisements and DHCPv6 if IPv6 is disabled
noipv6
''}
${cfg.extraConfig}
'';

View File

@ -6,6 +6,7 @@ let
cfg = config.services.chrony;
stateDir = "/var/lib/chrony";
driftFile = "${stateDir}/chrony.drift";
keyFile = "${stateDir}/chrony.keys";
configFile = pkgs.writeText "chrony.conf" ''
@ -16,7 +17,7 @@ let
"initstepslew ${toString cfg.initstepslew.threshold} ${concatStringsSep " " cfg.servers}"
}
driftfile ${stateDir}/chrony.drift
driftfile ${driftFile}
keyfile ${keyFile}
${optionalString (!config.time.hardwareClockInLocalTime) "rtconutc"}
@ -95,6 +96,7 @@ in
systemd.tmpfiles.rules = [
"d ${stateDir} 0755 chrony chrony - -"
"f ${driftFile} 0640 chrony chrony -"
"f ${keyFile} 0640 chrony chrony -"
];

View File

@ -1,31 +0,0 @@
#!/usr/bin/env -S nix-build --no-out-link
# Script to generate default streaming configurations for EPGStation. There's
# no need to run this script directly since generate.sh in the EPGStation
# package directory would run this script for you.
#
# Usage: ./generate | xargs cat > streaming.json
{ pkgs ? (import ../../../../.. {}) }:
let
sampleConfigPath = "${pkgs.epgstation.src}/config/config.sample.json";
sampleConfig = builtins.fromJSON (builtins.readFile sampleConfigPath);
streamingConfig = {
inherit (sampleConfig)
mpegTsStreaming
mpegTsViewer
liveHLS
liveMP4
liveWebM
recordedDownloader
recordedStreaming
recordedViewer
recordedHLS;
};
in
pkgs.runCommand "streaming.json" { nativeBuildInputs = [ pkgs.jq ]; } ''
jq . <<<'${builtins.toJSON streamingConfig}' > $out
''
# vim:set ft=nix:

View File

@ -1,119 +1,119 @@
{
"liveHLS": [
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 17 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -preset veryfast -flags +loop-global_header %OUTPUT%",
"name": "720p"
"name": "720p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 17 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -preset veryfast -flags +loop-global_header %OUTPUT%"
},
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 17 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -preset veryfast -flags +loop-global_header %OUTPUT%",
"name": "480p"
"name": "480p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 17 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -preset veryfast -flags +loop-global_header %OUTPUT%"
},
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 17 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 48k -ac 2 -c:v libx264 -vf yadif,scale=-2:180 -b:v 100k -preset veryfast -maxrate 110k -bufsize 1000k -flags +loop-global_header %OUTPUT%",
"name": "180p"
"name": "180p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 17 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 48k -ac 2 -c:v libx264 -vf yadif,scale=-2:180 -b:v 100k -preset veryfast -maxrate 110k -bufsize 1000k -flags +loop-global_header %OUTPUT%"
}
],
"liveMP4": [
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1",
"name": "720p"
"name": "720p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1"
},
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1",
"name": "480p"
"name": "480p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1"
}
],
"liveWebM": [
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 3 -c:a libvorbis -ar 48000 -b:a 192k -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:720 -b:v 3000k -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1",
"name": "720p"
"name": "720p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 3 -c:a libvorbis -ar 48000 -b:a 192k -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:720 -b:v 3000k -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1"
},
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 2 -c:a libvorbis -ar 48000 -b:a 128k -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:480 -b:v 1500k -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1",
"name": "480p"
"name": "480p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 2 -c:a libvorbis -ar 48000 -b:a 128k -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:480 -b:v 1500k -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1"
}
],
"mpegTsStreaming": [
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -preset veryfast -y -f mpegts pipe:1",
"name": "720p"
"name": "720p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -preset veryfast -y -f mpegts pipe:1"
},
{
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -preset veryfast -y -f mpegts pipe:1",
"name": "480p"
"name": "480p",
"cmd": "%FFMPEG% -re -dual_mono_mode main -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -preset veryfast -y -f mpegts pipe:1"
},
{
"name": "Original"
}
],
"mpegTsViewer": {
"android": "intent://ADDRESS#Intent;package=com.mxtech.videoplayer.ad;type=video;scheme=http;end",
"ios": "vlc-x-callback://x-callback-url/stream?url=http://ADDRESS"
"ios": "vlc-x-callback://x-callback-url/stream?url=http://ADDRESS",
"android": "intent://ADDRESS#Intent;package=com.mxtech.videoplayer.ad;type=video;scheme=http;end"
},
"recordedDownloader": {
"android": "intent://ADDRESS#Intent;package=com.dv.adm;type=video;scheme=http;end",
"ios": "vlc-x-callback://x-callback-url/download?url=http://ADDRESS&filename=FILENAME"
"ios": "vlc-x-callback://x-callback-url/download?url=http://ADDRESS&filename=FILENAME",
"android": "intent://ADDRESS#Intent;package=com.dv.adm;type=video;scheme=http;end"
},
"recordedHLS": [
"recordedStreaming": {
"webm": [
{
"cmd": "%FFMPEG% -dual_mono_mode main -i %INPUT% -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 0 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -preset veryfast -flags +loop-global_header %OUTPUT%",
"name": "720p"
"name": "720p",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 3 -c:a libvorbis -ar 48000 -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:720 %VB% %VBUFFER% %AB% %ABUFFER% -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1",
"vb": "3000k",
"ab": "192k"
},
{
"cmd": "%FFMPEG% -dual_mono_mode main -i %INPUT% -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 0 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -preset veryfast -flags +loop-global_header %OUTPUT%",
"name": "480p"
},
{
"cmd": "%FFMPEG% -dual_mono_mode main -i %INPUT% -sn -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 0 -hls_allow_cache 1 -hls_segment_type fmp4 -hls_fmp4_init_filename stream%streamNum%-init.mp4 -hls_segment_filename stream%streamNum%-%09d.m4s -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx265 -vf yadif,scale=-2:480 -b:v 350k -preset veryfast -tag:v hvc1 %OUTPUT%",
"name": "480p(h265)"
"name": "360p",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 2 -c:a libvorbis -ar 48000 -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:360 %VB% %VBUFFER% %AB% %ABUFFER% -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1",
"vb": "1500k",
"ab": "128k"
}
],
"recordedStreaming": {
"mp4": [
{
"ab": "192k",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:720 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1",
"name": "720p",
"vb": "3000k"
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:720 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1",
"vb": "3000k",
"ab": "192k"
},
{
"ab": "128k",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:360 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1",
"name": "360p",
"vb": "1500k"
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:360 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -movflags frag_keyframe+empty_moov+faststart+default_base_moof -y -f mp4 pipe:1",
"vb": "1500k",
"ab": "128k"
}
],
"mpegTs": [
{
"ab": "192k",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:720 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -y -f mpegts pipe:1",
"name": "720p (H.264)",
"vb": "3000k"
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:720 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -y -f mpegts pipe:1",
"vb": "3000k",
"ab": "192k"
},
{
"ab": "128k",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:360 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -y -f mpegts pipe:1",
"name": "360p (H.264)",
"vb": "1500k"
}
],
"webm": [
{
"ab": "192k",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 3 -c:a libvorbis -ar 48000 -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:720 %VB% %VBUFFER% %AB% %ABUFFER% -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1",
"name": "720p",
"vb": "3000k"
},
{
"ab": "128k",
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 2 -c:a libvorbis -ar 48000 -ac 2 -c:v libvpx-vp9 -vf yadif,scale=-2:360 %VB% %VBUFFER% %AB% %ABUFFER% -deadline realtime -speed 4 -cpu-used -8 -y -f webm pipe:1",
"name": "360p",
"vb": "1500k"
"cmd": "%FFMPEG% -dual_mono_mode main %RE% -i pipe:0 -sn -threads 0 -c:a aac -ar 48000 -ac 2 -c:v libx264 -vf yadif,scale=-2:360 %VB% %VBUFFER% %AB% %ABUFFER% -profile:v baseline -preset veryfast -tune fastdecode,zerolatency -y -f mpegts pipe:1",
"vb": "1500k",
"ab": "128k"
}
]
},
"recordedHLS": [
{
"name": "720p",
"cmd": "%FFMPEG% -dual_mono_mode main -i %INPUT% -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 0 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 192k -ac 2 -c:v libx264 -vf yadif,scale=-2:720 -b:v 3000k -preset veryfast -flags +loop-global_header %OUTPUT%"
},
{
"name": "480p",
"cmd": "%FFMPEG% -dual_mono_mode main -i %INPUT% -sn -threads 0 -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 0 -hls_allow_cache 1 -hls_segment_filename %streamFileDir%/stream%streamNum%-%09d.ts -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx264 -vf yadif,scale=-2:480 -b:v 1500k -preset veryfast -flags +loop-global_header %OUTPUT%"
},
{
"name": "480p(h265)",
"cmd": "%FFMPEG% -dual_mono_mode main -i %INPUT% -sn -map 0 -ignore_unknown -max_muxing_queue_size 1024 -f hls -hls_time 3 -hls_list_size 0 -hls_allow_cache 1 -hls_segment_type fmp4 -hls_fmp4_init_filename stream%streamNum%-init.mp4 -hls_segment_filename stream%streamNum%-%09d.m4s -c:a aac -ar 48000 -b:a 128k -ac 2 -c:v libx265 -vf yadif,scale=-2:480 -b:v 350k -preset veryfast -tag:v hvc1 %OUTPUT%"
}
],
"recordedViewer": {
"android": "intent://ADDRESS#Intent;package=com.mxtech.videoplayer.ad;type=video;scheme=http;end",
"ios": "infuse://x-callback-url/play?url=http://ADDRESS"
"ios": "infuse://x-callback-url/play?url=http://ADDRESS",
"android": "intent://ADDRESS#Intent;package=com.mxtech.videoplayer.ad;type=video;scheme=http;end"
}
}

View File

@ -0,0 +1,692 @@
{ config, pkgs, lib, ... }:
let
cfg = config.services.keycloak;
in
{
options.services.keycloak = {
enable = lib.mkOption {
type = lib.types.bool;
default = false;
example = true;
description = ''
Whether to enable the Keycloak identity and access management
server.
'';
};
bindAddress = lib.mkOption {
type = lib.types.str;
default = "\${jboss.bind.address:0.0.0.0}";
example = "127.0.0.1";
description = ''
On which address Keycloak should accept new connections.
A special syntax can be used to allow command line Java system
properties to override the value: ''${property.name:value}
'';
};
httpPort = lib.mkOption {
type = lib.types.str;
default = "\${jboss.http.port:80}";
example = "8080";
description = ''
On which port Keycloak should listen for new HTTP connections.
A special syntax can be used to allow command line Java system
properties to override the value: ''${property.name:value}
'';
};
httpsPort = lib.mkOption {
type = lib.types.str;
default = "\${jboss.https.port:443}";
example = "8443";
description = ''
On which port Keycloak should listen for new HTTPS connections.
A special syntax can be used to allow command line Java system
properties to override the value: ''${property.name:value}
'';
};
frontendUrl = lib.mkOption {
type = lib.types.str;
example = "keycloak.example.com/auth";
description = ''
The public URL used as base for all frontend requests. Should
normally include a trailing <literal>/auth</literal>.
See <link xlink:href="https://www.keycloak.org/docs/latest/server_installation/#_hostname">the
Hostname section of the Keycloak server installation
manual</link> for more information.
'';
};
forceBackendUrlToFrontendUrl = lib.mkOption {
type = lib.types.bool;
default = false;
example = true;
description = ''
Whether Keycloak should force all requests to go through the
frontend URL configured in <xref
linkend="opt-services.keycloak.frontendUrl" />. By default,
Keycloak allows backend requests to instead use its local
hostname or IP address and may also advertise it to clients
through its OpenID Connect Discovery endpoint.
See <link
xlink:href="https://www.keycloak.org/docs/latest/server_installation/#_hostname">the
Hostname section of the Keycloak server installation
manual</link> for more information.
'';
};
certificatePrivateKeyBundle = lib.mkOption {
type = lib.types.nullOr lib.types.path;
default = null;
example = "/run/keys/ssl_cert";
description = ''
The path to a PEM formatted bundle of the private key and
certificate to use for TLS connections.
This should be a string, not a Nix path, since Nix paths are
copied into the world-readable Nix store.
'';
};
databaseType = lib.mkOption {
type = lib.types.enum [ "mysql" "postgresql" ];
default = "postgresql";
example = "mysql";
description = ''
The type of database Keycloak should connect to.
'';
};
databaseHost = lib.mkOption {
type = lib.types.str;
default = "localhost";
description = ''
Hostname of the database to connect to.
'';
};
databasePort =
let
dbPorts = {
postgresql = 5432;
mysql = 3306;
};
in
lib.mkOption {
type = lib.types.port;
default = dbPorts.${cfg.databaseType};
description = ''
Port of the database to connect to.
'';
};
databaseUseSSL = lib.mkOption {
type = lib.types.bool;
default = cfg.databaseHost != "localhost";
description = ''
Whether the database connection should be secured by SSL /
TLS.
'';
};
databaseCaCert = lib.mkOption {
type = lib.types.nullOr lib.types.path;
default = null;
description = ''
The SSL / TLS CA certificate that verifies the identity of the
database server.
Required when PostgreSQL is used and SSL is turned on.
For MySQL, if left at <literal>null</literal>, the default
Java keystore is used, which should suffice if the server
certificate is issued by an official CA.
'';
};
databaseCreateLocally = lib.mkOption {
type = lib.types.bool;
default = true;
description = ''
Whether a database should be automatically created on the
local host. Set this to false if you plan on provisioning a
local database yourself. This has no effect if
services.keycloak.databaseHost is customized.
'';
};
databaseUsername = lib.mkOption {
type = lib.types.str;
default = "keycloak";
description = ''
Username to use when connecting to an external or manually
provisioned database; has no effect when a local database is
automatically provisioned.
'';
};
databasePasswordFile = lib.mkOption {
type = lib.types.path;
example = "/run/keys/db_password";
description = ''
File containing the database password.
This should be a string, not a Nix path, since Nix paths are
copied into the world-readable Nix store.
'';
};
package = lib.mkOption {
type = lib.types.package;
default = pkgs.keycloak;
description = ''
Keycloak package to use.
'';
};
initialAdminPassword = lib.mkOption {
type = lib.types.str;
default = "changeme";
description = ''
Initial password set for the <literal>admin</literal>
user. The password is not stored safely and should be changed
immediately in the admin panel.
'';
};
extraConfig = lib.mkOption {
type = lib.types.attrs;
default = { };
example = lib.literalExample ''
{
"subsystem=keycloak-server" = {
"spi=hostname" = {
"provider=default" = null;
"provider=fixed" = {
enabled = true;
properties.hostname = "keycloak.example.com";
};
default-provider = "fixed";
};
};
}
'';
description = ''
Additional Keycloak configuration options to set in
<literal>standalone.xml</literal>.
Options are expressed as a Nix attribute set which matches the
structure of the jboss-cli configuration. The configuration is
effectively overlayed on top of the default configuration
shipped with Keycloak. To remove existing nodes and undefine
attributes from the default configuration, set them to
<literal>null</literal>.
The example configuration does the equivalent of the following
script, which removes the hostname provider
<literal>default</literal>, adds the deprecated hostname
provider <literal>fixed</literal> and defines it as the default:
<programlisting>
/subsystem=keycloak-server/spi=hostname/provider=default:remove()
/subsystem=keycloak-server/spi=hostname/provider=fixed:add(enabled = true, properties = { hostname = "keycloak.example.com" })
/subsystem=keycloak-server/spi=hostname:write-attribute(name=default-provider, value="fixed")
</programlisting>
You can discover available options by using the <link
xlink:href="http://docs.wildfly.org/21/Admin_Guide.html#Command_Line_Interface">jboss-cli.sh</link>
program and by referring to the <link
xlink:href="https://www.keycloak.org/docs/latest/server_installation/index.html">Keycloak
Server Installation and Configuration Guide</link>.
'';
};
};
config =
let
# We only want to create a database if we're actually going to connect to it.
databaseActuallyCreateLocally = cfg.databaseCreateLocally && cfg.databaseHost == "localhost";
createLocalPostgreSQL = databaseActuallyCreateLocally && cfg.databaseType == "postgresql";
createLocalMySQL = databaseActuallyCreateLocally && cfg.databaseType == "mysql";
mySqlCaKeystore = pkgs.runCommandNoCC "mysql-ca-keystore" {} ''
${pkgs.jre}/bin/keytool -importcert -trustcacerts -alias MySQLCACert -file ${cfg.databaseCaCert} -keystore $out -storepass notsosecretpassword -noprompt
'';
keycloakConfig' = builtins.foldl' lib.recursiveUpdate {
"interface=public".inet-address = cfg.bindAddress;
"socket-binding-group=standard-sockets"."socket-binding=http".port = cfg.httpPort;
"subsystem=keycloak-server"."spi=hostname" = {
"provider=default" = {
enabled = true;
properties = {
inherit (cfg) frontendUrl forceBackendUrlToFrontendUrl;
};
};
};
"subsystem=datasources"."data-source=KeycloakDS" = {
max-pool-size = "20";
user-name = if databaseActuallyCreateLocally then "keycloak" else cfg.databaseUsername;
password = "@db-password@";
};
} [
(lib.optionalAttrs (cfg.databaseType == "postgresql") {
"subsystem=datasources" = {
"jdbc-driver=postgresql" = {
driver-module-name = "org.postgresql";
driver-name = "postgresql";
driver-xa-datasource-class-name = "org.postgresql.xa.PGXADataSource";
};
"data-source=KeycloakDS" = {
connection-url = "jdbc:postgresql://${cfg.databaseHost}:${builtins.toString cfg.databasePort}/keycloak";
driver-name = "postgresql";
"connection-properties=ssl".value = lib.boolToString cfg.databaseUseSSL;
} // (lib.optionalAttrs (cfg.databaseCaCert != null) {
"connection-properties=sslrootcert".value = cfg.databaseCaCert;
"connection-properties=sslmode".value = "verify-ca";
});
};
})
(lib.optionalAttrs (cfg.databaseType == "mysql") {
"subsystem=datasources" = {
"jdbc-driver=mysql" = {
driver-module-name = "com.mysql";
driver-name = "mysql";
driver-class-name = "com.mysql.jdbc.Driver";
};
"data-source=KeycloakDS" = {
connection-url = "jdbc:mysql://${cfg.databaseHost}:${builtins.toString cfg.databasePort}/keycloak";
driver-name = "mysql";
"connection-properties=useSSL".value = lib.boolToString cfg.databaseUseSSL;
"connection-properties=requireSSL".value = lib.boolToString cfg.databaseUseSSL;
"connection-properties=verifyServerCertificate".value = lib.boolToString cfg.databaseUseSSL;
"connection-properties=characterEncoding".value = "UTF-8";
valid-connection-checker-class-name = "org.jboss.jca.adapters.jdbc.extensions.mysql.MySQLValidConnectionChecker";
validate-on-match = true;
exception-sorter-class-name = "org.jboss.jca.adapters.jdbc.extensions.mysql.MySQLExceptionSorter";
} // (lib.optionalAttrs (cfg.databaseCaCert != null) {
"connection-properties=trustCertificateKeyStoreUrl".value = "file:${mySqlCaKeystore}";
"connection-properties=trustCertificateKeyStorePassword".value = "notsosecretpassword";
});
};
})
(lib.optionalAttrs (cfg.certificatePrivateKeyBundle != null) {
"socket-binding-group=standard-sockets"."socket-binding=https".port = cfg.httpsPort;
"core-service=management"."security-realm=UndertowRealm"."server-identity=ssl" = {
keystore-path = "/run/keycloak/ssl/certificate_private_key_bundle.p12";
keystore-password = "notsosecretpassword";
};
"subsystem=undertow"."server=default-server"."https-listener=https".security-realm = "UndertowRealm";
})
cfg.extraConfig
];
/* Produces a JBoss CLI script that creates paths and sets
attributes matching those described by `attrs`. When the
script is run, the existing settings are effectively overlayed
by those from `attrs`. Existing attributes can be unset by
defining them `null`.
JBoss paths and attributes / maps are distinguished by their
name, where paths follow a `key=value` scheme.
Example:
mkJbossScript {
"subsystem=keycloak-server"."spi=hostname" = {
"provider=fixed" = null;
"provider=default" = {
enabled = true;
properties = {
inherit frontendUrl;
forceBackendUrlToFrontendUrl = false;
};
};
};
}
=> ''
if (outcome != success) of /:read-resource()
/:add()
end-if
if (outcome != success) of /subsystem=keycloak-server:read-resource()
/subsystem=keycloak-server:add()
end-if
if (outcome != success) of /subsystem=keycloak-server/spi=hostname:read-resource()
/subsystem=keycloak-server/spi=hostname:add()
end-if
if (outcome != success) of /subsystem=keycloak-server/spi=hostname/provider=default:read-resource()
/subsystem=keycloak-server/spi=hostname/provider=default:add(enabled = true, properties = { forceBackendUrlToFrontendUrl = false, frontendUrl = "https://keycloak.example.com/auth" })
end-if
if (result != true) of /subsystem=keycloak-server/spi=hostname/provider=default:read-attribute(name="enabled")
/subsystem=keycloak-server/spi=hostname/provider=default:write-attribute(name=enabled, value=true)
end-if
if (result != false) of /subsystem=keycloak-server/spi=hostname/provider=default:read-attribute(name="properties.forceBackendUrlToFrontendUrl")
/subsystem=keycloak-server/spi=hostname/provider=default:write-attribute(name=properties.forceBackendUrlToFrontendUrl, value=false)
end-if
if (result != "https://keycloak.example.com/auth") of /subsystem=keycloak-server/spi=hostname/provider=default:read-attribute(name="properties.frontendUrl")
/subsystem=keycloak-server/spi=hostname/provider=default:write-attribute(name=properties.frontendUrl, value="https://keycloak.example.com/auth")
end-if
if (outcome != success) of /subsystem=keycloak-server/spi=hostname/provider=fixed:read-resource()
/subsystem=keycloak-server/spi=hostname/provider=fixed:remove()
end-if
''
*/
mkJbossScript = attrs:
let
/* From a JBoss path and an attrset, produces a JBoss CLI
snippet that writes the corresponding attributes starting
at `path`. Recurses down into subattrsets as necessary,
producing the variable name from its full path in the
attrset.
Example:
writeAttributes "/subsystem=keycloak-server/spi=hostname/provider=default" {
enabled = true;
properties = {
forceBackendUrlToFrontendUrl = false;
frontendUrl = "https://keycloak.example.com/auth";
};
}
=> ''
if (result != true) of /subsystem=keycloak-server/spi=hostname/provider=default:read-attribute(name="enabled")
/subsystem=keycloak-server/spi=hostname/provider=default:write-attribute(name=enabled, value=true)
end-if
if (result != false) of /subsystem=keycloak-server/spi=hostname/provider=default:read-attribute(name="properties.forceBackendUrlToFrontendUrl")
/subsystem=keycloak-server/spi=hostname/provider=default:write-attribute(name=properties.forceBackendUrlToFrontendUrl, value=false)
end-if
if (result != "https://keycloak.example.com/auth") of /subsystem=keycloak-server/spi=hostname/provider=default:read-attribute(name="properties.frontendUrl")
/subsystem=keycloak-server/spi=hostname/provider=default:write-attribute(name=properties.frontendUrl, value="https://keycloak.example.com/auth")
end-if
''
*/
writeAttributes = path: set:
let
# JBoss expressions like `${var}` need to be prefixed
# with `expression` to evaluate.
prefixExpression = string:
let
match = (builtins.match ''"\$\{.*}"'' string);
in
if match != null then
"expression " + string
else
string;
writeAttribute = attribute: value:
let
type = builtins.typeOf value;
in
if type == "set" then
let
names = builtins.attrNames value;
in
builtins.foldl' (text: name: text + (writeAttribute "${attribute}.${name}" value.${name})) "" names
else if value == null then ''
if (outcome == success) of ${path}:read-attribute(name="${attribute}")
${path}:undefine-attribute(name="${attribute}")
end-if
''
else if builtins.elem type [ "string" "path" "bool" ] then
let
value' = if type == "bool" then lib.boolToString value else ''"${value}"'';
in ''
if (result != ${prefixExpression value'}) of ${path}:read-attribute(name="${attribute}")
${path}:write-attribute(name=${attribute}, value=${value'})
end-if
''
else throw "Unsupported type '${type}' for path '${path}'!";
in
lib.concatStrings
(lib.mapAttrsToList
(attribute: value: (writeAttribute attribute value))
set);
/* Produces an argument list for the JBoss `add()` function,
which adds a JBoss path and takes as its arguments the
required subpaths and attributes.
Example:
makeArgList {
enabled = true;
properties = {
forceBackendUrlToFrontendUrl = false;
frontendUrl = "https://keycloak.example.com/auth";
};
}
=> ''
enabled = true, properties = { forceBackendUrlToFrontendUrl = false, frontendUrl = "https://keycloak.example.com/auth" }
''
*/
makeArgList = set:
let
makeArg = attribute: value:
let
type = builtins.typeOf value;
in
if type == "set" then
"${attribute} = { " + (makeArgList value) + " }"
else if builtins.elem type [ "string" "path" "bool" ] then
"${attribute} = ${if type == "bool" then lib.boolToString value else ''"${value}"''}"
else if value == null then
""
else
throw "Unsupported type '${type}' for attribute '${attribute}'!";
in
lib.concatStringsSep ", " (lib.mapAttrsToList makeArg set);
/* Recurses into the `attrs` attrset, beginning at the path
resolved from `state.path ++ node`; if `node` is `null`,
starts from `state.path`. Only subattrsets that are JBoss
paths, i.e. follow the `key=value` format, are recursed
into; the rest are considered JBoss attributes / maps.
*/
recurse = state: node:
let
path = state.path ++ (lib.optional (node != null) node);
isPath = name:
let
value = lib.getAttrFromPath (path ++ [ name ]) attrs;
in
if (builtins.match ".*([=]).*" name) == [ "=" ] then
if builtins.isAttrs value || value == null then
true
else
throw "Parsing path '${lib.concatStringsSep "." (path ++ [ name ])}' failed: JBoss attributes cannot contain '='!"
else
false;
jbossPath = "/" + (lib.concatStringsSep "/" path);
nodeValue = lib.getAttrFromPath path attrs;
children = if !builtins.isAttrs nodeValue then {} else nodeValue;
subPaths = builtins.filter isPath (builtins.attrNames children);
jbossAttrs = lib.filterAttrs (name: _: !(isPath name)) children;
in
state // {
text = state.text + (
if nodeValue != null then ''
if (outcome != success) of ${jbossPath}:read-resource()
${jbossPath}:add(${makeArgList jbossAttrs})
end-if
'' + (writeAttributes jbossPath jbossAttrs)
else ''
if (outcome == success) of ${jbossPath}:read-resource()
${jbossPath}:remove()
end-if
'') + (builtins.foldl' recurse { text = ""; inherit path; } subPaths).text;
};
in
(recurse { text = ""; path = []; } null).text;
jbossCliScript = pkgs.writeText "jboss-cli-script" (mkJbossScript keycloakConfig');
keycloakConfig = pkgs.runCommandNoCC "keycloak-config" {} ''
export JBOSS_BASE_DIR="$(pwd -P)";
export JBOSS_MODULEPATH="${cfg.package}/modules";
export JBOSS_LOG_DIR="$JBOSS_BASE_DIR/log";
cp -r ${cfg.package}/standalone/configuration .
chmod -R u+rwX ./configuration
mkdir -p {deployments,ssl}
"${cfg.package}/bin/standalone.sh"&
attempt=1
max_attempts=30
while ! ${cfg.package}/bin/jboss-cli.sh --connect ':read-attribute(name=server-state)'; do
if [[ "$attempt" == "$max_attempts" ]]; then
echo "ERROR: Could not connect to Keycloak after $attempt attempts! Failing.." >&2
exit 1
fi
echo "Keycloak not fully started yet, retrying.. ($attempt/$max_attempts)"
sleep 1
(( attempt++ ))
done
${cfg.package}/bin/jboss-cli.sh --connect --file=${jbossCliScript} --echo-command
cp configuration/standalone.xml $out
'';
in
lib.mkIf cfg.enable {
assertions = [
{
assertion = (cfg.databaseUseSSL && cfg.databaseType == "postgresql") -> (cfg.databaseCaCert != null);
message = ''A CA certificate must be specified (in 'services.keycloak.databaseCaCert') when PostgreSQL is used with SSL'';
}
];
environment.systemPackages = [ cfg.package ];
systemd.services.keycloakPostgreSQLInit = lib.mkIf createLocalPostgreSQL {
after = [ "postgresql.service" ];
before = [ "keycloak.service" ];
bindsTo = [ "postgresql.service" ];
serviceConfig = {
Type = "oneshot";
RemainAfterExit = true;
User = "postgres";
Group = "postgres";
};
script = ''
set -eu
PSQL=${config.services.postgresql.package}/bin/psql
db_password="$(<'${cfg.databasePasswordFile}')"
$PSQL -tAc "SELECT 1 FROM pg_roles WHERE rolname='keycloak'" | grep -q 1 || $PSQL -tAc "CREATE ROLE keycloak WITH LOGIN PASSWORD '$db_password' CREATEDB"
$PSQL -tAc "SELECT 1 FROM pg_database WHERE datname = 'keycloak'" | grep -q 1 || $PSQL -tAc 'CREATE DATABASE "keycloak" OWNER "keycloak"'
'';
};
systemd.services.keycloakMySQLInit = lib.mkIf createLocalMySQL {
after = [ "mysql.service" ];
before = [ "keycloak.service" ];
bindsTo = [ "mysql.service" ];
serviceConfig = {
Type = "oneshot";
RemainAfterExit = true;
User = config.services.mysql.user;
Group = config.services.mysql.group;
};
script = ''
set -eu
db_password="$(<'${cfg.databasePasswordFile}')"
( echo "CREATE USER IF NOT EXISTS 'keycloak'@'localhost' IDENTIFIED BY '$db_password';"
echo "CREATE DATABASE keycloak CHARACTER SET utf8 COLLATE utf8_unicode_ci;"
echo "GRANT ALL PRIVILEGES ON keycloak.* TO 'keycloak'@'localhost';"
) | ${config.services.mysql.package}/bin/mysql -N
'';
};
systemd.services.keycloak =
let
databaseServices =
if createLocalPostgreSQL then [
"keycloakPostgreSQLInit.service" "postgresql.service"
]
else if createLocalMySQL then [
"keycloakMySQLInit.service" "mysql.service"
]
else [ ];
in {
after = databaseServices;
bindsTo = databaseServices;
wantedBy = [ "multi-user.target" ];
environment = {
JBOSS_LOG_DIR = "/var/log/keycloak";
JBOSS_BASE_DIR = "/run/keycloak";
JBOSS_MODULEPATH = "${cfg.package}/modules";
};
serviceConfig = {
ExecStartPre = let
startPreFullPrivileges = ''
set -eu
install -T -m 0400 -o keycloak -g keycloak '${cfg.databasePasswordFile}' /run/keycloak/secrets/db_password
'' + lib.optionalString (cfg.certificatePrivateKeyBundle != null) ''
install -T -m 0400 -o keycloak -g keycloak '${cfg.certificatePrivateKeyBundle}' /run/keycloak/secrets/ssl_cert_pk_bundle
'';
startPre = ''
set -eu
install -m 0600 ${cfg.package}/standalone/configuration/*.properties /run/keycloak/configuration
install -T -m 0600 ${keycloakConfig} /run/keycloak/configuration/standalone.xml
db_password="$(</run/keycloak/secrets/db_password)"
${pkgs.replace}/bin/replace-literal -fe '@db-password@' "$db_password" /run/keycloak/configuration/standalone.xml
export JAVA_OPTS=-Djboss.server.config.user.dir=/run/keycloak/configuration
${cfg.package}/bin/add-user-keycloak.sh -u admin -p '${cfg.initialAdminPassword}'
'' + lib.optionalString (cfg.certificatePrivateKeyBundle != null) ''
pushd /run/keycloak/ssl/
cat /run/keycloak/secrets/ssl_cert_pk_bundle <(echo) /etc/ssl/certs/ca-certificates.crt > allcerts.pem
${pkgs.openssl}/bin/openssl pkcs12 -export -in /run/keycloak/secrets/ssl_cert_pk_bundle -chain \
-name "${cfg.frontendUrl}" -out certificate_private_key_bundle.p12 \
-CAfile allcerts.pem -passout pass:notsosecretpassword
popd
'';
in [
"+${pkgs.writeShellScript "keycloak-start-pre-full-privileges" startPreFullPrivileges}"
"${pkgs.writeShellScript "keycloak-start-pre" startPre}"
];
ExecStart = "${cfg.package}/bin/standalone.sh";
User = "keycloak";
Group = "keycloak";
DynamicUser = true;
RuntimeDirectory = map (p: "keycloak/" + p) [
"secrets"
"configuration"
"deployments"
"data"
"ssl"
"log"
"tmp"
];
RuntimeDirectoryMode = 0700;
LogsDirectory = "keycloak";
AmbientCapabilities = "CAP_NET_BIND_SERVICE";
};
};
services.postgresql.enable = lib.mkDefault createLocalPostgreSQL;
services.mysql.enable = lib.mkDefault createLocalMySQL;
services.mysql.package = lib.mkIf createLocalMySQL pkgs.mysql;
};
meta.doc = ./keycloak.xml;
}

View File

@ -0,0 +1,205 @@
<chapter xmlns="http://docbook.org/ns/docbook"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:xi="http://www.w3.org/2001/XInclude"
version="5.0"
xml:id="module-services-keycloak">
<title>Keycloak</title>
<para>
<link xlink:href="https://www.keycloak.org/">Keycloak</link> is an
open source identity and access management server with support for
<link xlink:href="https://openid.net/connect/">OpenID
Connect</link>, <link xlink:href="https://oauth.net/2/">OAuth
2.0</link> and <link
xlink:href="https://en.wikipedia.org/wiki/SAML_2.0">SAML
2.0</link>.
</para>
<section xml:id="module-services-keycloak-admin">
<title>Administration</title>
<para>
An administrative user with the username
<literal>admin</literal> is automatically created in the
<literal>master</literal> realm. Its initial password can be
configured by setting <xref linkend="opt-services.keycloak.initialAdminPassword" />
and defaults to <literal>changeme</literal>. The password is
stored in plain text in the world-readable Nix store and should
therefore be changed immediately in the admin panel.
</para>
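<para>
For example, a different initial password can be set declaratively;
the value below is only a placeholder and, as noted above, is not
stored securely:
<programlisting>
services.keycloak.initialAdminPassword = "correct-horse-battery-staple";
</programlisting>
</para>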
<para>
Refer to the <link
xlink:href="https://www.keycloak.org/docs/latest/server_admin/index.html#admin-console">Admin
Console section of the Keycloak Server Administration Guide</link> for
information on how to administer your
<productname>Keycloak</productname> instance.
</para>
</section>
<section xml:id="module-services-keycloak-database">
<title>Database access</title>
<para>
<productname>Keycloak</productname> can be used with either
<productname>PostgreSQL</productname> or
<productname>MySQL</productname>. Which one is used can be
configured in <xref
linkend="opt-services.keycloak.databaseType" />. The selected
database will automatically be enabled and a database and role
created unless <xref
linkend="opt-services.keycloak.databaseHost" /> is changed from
its default of <literal>localhost</literal> or <xref
linkend="opt-services.keycloak.databaseCreateLocally" /> is set
to <literal>false</literal>.
</para>
<para>
External database access can also be configured by setting
<xref linkend="opt-services.keycloak.databaseHost" />, <xref
linkend="opt-services.keycloak.databaseUsername" />, <xref
linkend="opt-services.keycloak.databaseUseSSL" /> and <xref
linkend="opt-services.keycloak.databaseCaCert" /> as
appropriate. Note that you need to manually create a database
called <literal>keycloak</literal> and allow the configured
database user full access to it.
</para>
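<para>
As a rough sketch, connecting to an external
<productname>PostgreSQL</productname> server over SSL could look
something like the following, where the host name, user name and
certificate path are placeholders:
<programlisting>
services.keycloak = {
  databaseType = "postgresql";
  databaseHost = "db.example.com";
  databaseUsername = "keycloak";
  databaseUseSSL = true;
  databaseCaCert = "/etc/keycloak/db-ca.pem";
  databasePasswordFile = "/run/keys/db_password";
};
</programlisting>
</para>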
<para>
<xref linkend="opt-services.keycloak.databasePasswordFile" />
must be set to the path to a file containing the password used
to log in to the database. If <xref linkend="opt-services.keycloak.databaseHost" />
and <xref linkend="opt-services.keycloak.databaseCreateLocally" />
are kept at their defaults, the database role
<literal>keycloak</literal> with that password is provisioned
on the local database instance.
</para>
<warning>
<para>
The path should be provided as a string, not a Nix path, since Nix
paths are copied into the world-readable Nix store.
</para>
</warning>
</section>
<section xml:id="module-services-keycloak-frontendurl">
<title>Frontend URL</title>
<para>
The frontend URL is used as base for all frontend requests and
must be configured through <xref linkend="opt-services.keycloak.frontendUrl" />.
It should normally include a trailing <literal>/auth</literal>
(the default web context).
</para>
<para>
<xref linkend="opt-services.keycloak.forceBackendUrlToFrontendUrl" />
determines whether Keycloak should force all requests to go
through the frontend URL. By default,
<productname>Keycloak</productname> allows backend requests to
instead use its local hostname or IP address and may also
advertise it to clients through its OpenID Connect Discovery
endpoint.
</para>
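<para>
A minimal sketch combining the two settings, with
<literal>keycloak.example.com</literal> standing in for the real
domain:
<programlisting>
services.keycloak.frontendUrl = "https://keycloak.example.com/auth";
services.keycloak.forceBackendUrlToFrontendUrl = true;
</programlisting>
</para>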
<para>
See the <link
xlink:href="https://www.keycloak.org/docs/latest/server_installation/#_hostname">Hostname
section of the Keycloak Server Installation and Configuration
Guide</link> for more information.
</para>
</section>
<section xml:id="module-services-keycloak-tls">
<title>Setting up TLS/SSL</title>
<para>
By default, <productname>Keycloak</productname> won't accept
unsecured HTTP connections originating from outside its local
network.
</para>
<para>
For HTTPS support, a TLS certificate and private key is
required. They should be <link
xlink:href="https://en.wikipedia.org/wiki/Privacy-Enhanced_Mail">PEM
formatted</link> and concatenated into a single file. The path
to this file should be configured in
<xref linkend="opt-services.keycloak.certificatePrivateKeyBundle" />.
</para>
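<para>
A minimal sketch of how the bundle could be wired up; the path is a
placeholder and the bundle itself has to be produced out of band,
for instance by concatenating the PEM certificate chain and private
key into one file:
<programlisting>
# assuming the bundle was created beforehand, e.g. with
#   cat fullchain.pem privkey.pem > /run/keys/ssl_cert
services.keycloak.certificatePrivateKeyBundle = "/run/keys/ssl_cert";
</programlisting>
</para>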
<warning>
<para>
The path should be provided as a string, not a Nix path,
since Nix paths are copied into the world-readable Nix store.
</para>
</warning>
</section>
<section xml:id="module-services-keycloak-extra-config">
<title>Additional configuration</title>
<para>
Additional Keycloak configuration options, for which no
explicit <productname>NixOS</productname> options are provided,
can be set in <xref linkend="opt-services.keycloak.extraConfig" />.
</para>
<para>
Options are expressed as a Nix attribute set which matches the
structure of the jboss-cli configuration. The configuration is
effectively overlaid on top of the default configuration
shipped with Keycloak. To remove existing nodes and undefine
attributes from the default configuration, set them to
<literal>null</literal>.
</para>
<para>
For example, the following script, which removes the hostname
provider <literal>default</literal>, adds the deprecated
hostname provider <literal>fixed</literal> and sets it as the
default provider:
<programlisting>
/subsystem=keycloak-server/spi=hostname/provider=default:remove()
/subsystem=keycloak-server/spi=hostname/provider=fixed:add(enabled = true, properties = { hostname = "keycloak.example.com" })
/subsystem=keycloak-server/spi=hostname:write-attribute(name=default-provider, value="fixed")
</programlisting>
would be expressed as
<programlisting>
services.keycloak.extraConfig = {
"subsystem=keycloak-server" = {
"spi=hostname" = {
"provider=default" = null;
"provider=fixed" = {
enabled = true;
properties.hostname = "keycloak.example.com";
};
default-provider = "fixed";
};
};
};
</programlisting>
</para>
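<para>
In the same vein, a single attribute of an existing node can be
undefined by setting it to <literal>null</literal> while leaving the
rest of the node in place; a sketch, using an arbitrary property as
an example:
<programlisting>
services.keycloak.extraConfig = {
  "subsystem=keycloak-server" = {
    "spi=hostname" = {
      "provider=default" = {
        properties.forceBackendUrlToFrontendUrl = null;
      };
    };
  };
};
</programlisting>
</para>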
<para>
You can discover available options by using the <link
xlink:href="http://docs.wildfly.org/21/Admin_Guide.html#Command_Line_Interface">jboss-cli.sh</link>
program and by referring to the <link
xlink:href="https://www.keycloak.org/docs/latest/server_installation/index.html">Keycloak
Server Installation and Configuration Guide</link>.
</para>
</section>
<section xml:id="module-services-keycloak-example-config">
<title>Example configuration</title>
<para>
A basic configuration with some custom settings could look like this:
<programlisting>
services.keycloak = {
<link linkend="opt-services.keycloak.enable">enable</link> = true;
<link linkend="opt-services.keycloak.initialAdminPassword">initialAdminPassword</link> = "e6Wcm0RrtegMEHl"; # change on first login
<link linkend="opt-services.keycloak.frontendUrl">frontendUrl</link> = "https://keycloak.example.com/auth";
<link linkend="opt-services.keycloak.forceBackendUrlToFrontendUrl">forceBackendUrlToFrontendUrl</link> = true;
<link linkend="opt-services.keycloak.certificatePrivateKeyBundle">certificatePrivateKeyBundle</link> = "/run/keys/ssl_cert";
<link linkend="opt-services.keycloak.databasePasswordFile">databasePasswordFile</link> = "/run/keys/db_password";
};
</programlisting>
</para>
</section>
</chapter>

View File

@ -175,6 +175,7 @@ in
kernel-latest = handleTest ./kernel-latest.nix {};
kernel-lts = handleTest ./kernel-lts.nix {};
kernel-testing = handleTest ./kernel-testing.nix {};
keycloak = discoverTests (import ./keycloak.nix);
keymap = handleTest ./keymap.nix {};
knot = handleTest ./knot.nix {};
krb5 = discoverTests (import ./krb5 {});

View File

@ -22,6 +22,10 @@ import ../make-test-python.nix ({ lib, ... }:
hostKeys = [ ./ssh_host_ed25519_key ];
};
};
boot.initrd.extraUtilsCommands = ''
mkdir -p $out/secrets/etc/ssh
cat "${./ssh_host_ed25519_key}" > $out/secrets/etc/ssh/sh_host_ed25519_key
'';
boot.initrd.preLVMCommands = ''
while true; do
if [ -f fnord ]; then

144
nixos/tests/keycloak.nix Normal file
View File

@ -0,0 +1,144 @@
# This tests Keycloak: it starts the service, creates a realm with an
# OIDC client and a user, and simulates the user logging in to the
# client using their Keycloak login.
let
frontendUrl = "http://keycloak/auth";
initialAdminPassword = "h4IhoJFnt2iQIR9";
keycloakTest = import ./make-test-python.nix (
{ pkgs, databaseType, ... }:
{
name = "keycloak";
meta = with pkgs.stdenv.lib.maintainers; {
maintainers = [ talyz ];
};
nodes = {
keycloak = { ... }: {
virtualisation.memorySize = 1024;
services.keycloak = {
enable = true;
inherit frontendUrl databaseType initialAdminPassword;
databasePasswordFile = pkgs.writeText "dbPassword" "wzf6vOCbPp6cqTH";
};
environment.systemPackages = with pkgs; [
xmlstarlet
libtidy
jq
];
};
};
testScript =
let
client = {
clientId = "test-client";
name = "test-client";
redirectUris = [ "urn:ietf:wg:oauth:2.0:oob" ];
};
user = {
firstName = "Chuck";
lastName = "Testa";
username = "chuck.testa";
email = "chuck.testa@example.com";
};
password = "password1234";
realm = {
enabled = true;
realm = "test-realm";
clients = [ client ];
users = [(
user // {
enabled = true;
credentials = [{
type = "password";
temporary = false;
value = password;
}];
}
)];
};
realmDataJson = pkgs.writeText "realm-data.json" (builtins.toJSON realm);
jqCheckUserinfo = pkgs.writeText "check-userinfo.jq" ''
if {
"firstName": .given_name,
"lastName": .family_name,
"username": .preferred_username,
"email": .email
} != ${builtins.toJSON user} then
error("Wrong user info!")
else
empty
end
'';
in ''
keycloak.start()
keycloak.wait_for_unit("keycloak.service")
keycloak.wait_until_succeeds("curl -sSf ${frontendUrl}")
### Realm Setup ###
# Get an admin interface access token
keycloak.succeed(
"curl -sSf -d 'client_id=admin-cli' -d 'username=admin' -d 'password=${initialAdminPassword}' -d 'grant_type=password' '${frontendUrl}/realms/master/protocol/openid-connect/token' | jq -r '\"Authorization: bearer \" + .access_token' >admin_auth_header"
)
# Publish the realm, including a test OIDC client and user
keycloak.succeed(
"curl -sSf -H @admin_auth_header -X POST -H 'Content-Type: application/json' -d @${realmDataJson} '${frontendUrl}/admin/realms/'"
)
# Generate and save the client secret. To do this we need
# Keycloak's internal id for the client.
keycloak.succeed(
"curl -sSf -H @admin_auth_header '${frontendUrl}/admin/realms/${realm.realm}/clients?clientId=${client.name}' | jq -r '.[].id' >client_id",
"curl -sSf -H @admin_auth_header -X POST '${frontendUrl}/admin/realms/${realm.realm}/clients/'$(<client_id)'/client-secret' | jq -r .value >client_secret",
)
### Authentication Testing ###
# Start the login process by sending an initial request to the
# OIDC authentication endpoint, saving the returned page. Tidy
# up the HTML (XmlStarlet is picky) and extract the login form
# post url.
keycloak.succeed(
"curl -sSf -c cookie '${frontendUrl}/realms/${realm.realm}/protocol/openid-connect/auth?client_id=${client.name}&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=openid+email&response_type=code&response_mode=query&nonce=qw4o89g3qqm' >login_form",
"tidy -q -m login_form || true",
"xml sel -T -t -m \"_:html/_:body/_:div/_:div/_:div/_:div/_:div/_:div/_:form[@id='kc-form-login']\" -v @action login_form >form_post_url",
)
# Post the login form and save the response. Once again tidy up
# the HTML, then extract the authorization code.
keycloak.succeed(
"curl -sSf -L -b cookie -d 'username=${user.username}' -d 'password=${password}' -d 'credentialId=' \"$(<form_post_url)\" >auth_code_html",
"tidy -q -m auth_code_html || true",
"xml sel -T -t -m \"_:html/_:body/_:div/_:div/_:div/_:div/_:div/_:input[@id='code']\" -v @value auth_code_html >auth_code",
)
# Exchange the authorization code for an access token.
keycloak.succeed(
"curl -sSf -d grant_type=authorization_code -d code=$(<auth_code) -d client_id=${client.name} -d client_secret=$(<client_secret) -d redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob '${frontendUrl}/realms/${realm.realm}/protocol/openid-connect/token' | jq -r '\"Authorization: bearer \" + .access_token' >auth_header"
)
# Use the access token on the OIDC userinfo endpoint and check
# that the returned user info matches what we initialized the
# realm with.
keycloak.succeed(
"curl -sSf -H @auth_header '${frontendUrl}/realms/${realm.realm}/protocol/openid-connect/userinfo' | jq -f ${jqCheckUserinfo}"
)
'';
}
);
in
{
postgres = keycloakTest { databaseType = "postgresql"; };
mysql = keycloakTest { databaseType = "mysql"; };
}

View File

@ -12,15 +12,28 @@ import ./make-test-python.nix ({ lib, pkgs, ... }:
enable = true;
configFile = "${pkgs.grafana-loki.src}/cmd/loki/loki-local-config.yaml";
};
systemd.services.promtail = {
description = "Promtail service for Loki test";
wantedBy = [ "multi-user.target" ];
serviceConfig = {
ExecStart = ''
${pkgs.grafana-loki}/bin/promtail --config.file ${pkgs.grafana-loki.src}/cmd/promtail/promtail-local-config.yaml
'';
DynamicUser = true;
services.promtail = {
enable = true;
configuration = {
server = {
http_listen_port = 9080;
grpc_listen_port = 0;
};
clients = [ { url = "http://localhost:3100/loki/api/v1/push"; } ];
scrape_configs = [
{
job_name = "system";
static_configs = [
{
targets = [ "localhost" ];
labels = {
job = "varlogs";
__path__ = "/var/log/*log";
};
}
];
}
];
};
};
};

View File

@ -9,13 +9,13 @@
stdenv.mkDerivation rec {
pname = "ft2-clone";
version = "1.36";
version = "1.37";
src = fetchFromGitHub {
owner = "8bitbubsy";
repo = "ft2-clone";
rev = "v${version}";
sha256 = "0hsgzh7s2qgl8ah8hzmhfl74v5y8wc7f6z8ly9026h5r6pb09id0";
sha256 = "1lhpzd46mpr3bq13qhd0bq724db5fhc8jplfb684c2q7sc4v92nk";
};
nativeBuildInputs = [ cmake ];

View File

@ -10,13 +10,13 @@ assert pcreSupport -> pcre != null;
stdenv.mkDerivation rec {
pname = "ncmpc";
version = "0.39";
version = "0.42";
src = fetchFromGitHub {
owner = "MusicPlayerDaemon";
repo = "ncmpc";
rev = "v${version}";
sha256 = "08xrcinfm1a7hjycf8la7gnsxbp3six70ks987dr7j42kd42irfq";
sha256 = "1c21sbdm6pp3kwhnzc7c6ksna7madvsmfa7j91as2g8485symqv2";
};
buildInputs = [ glib ncurses mpd_clientlib boost ]

View File

@ -3,13 +3,13 @@
stdenv.mkDerivation rec {
pname = "ncpamixer";
version = "1.3.3";
version = "1.3.3.1";
src = fetchFromGitHub {
owner = "fulhax";
repo = "ncpamixer";
rev = version;
sha256 = "19pxfvfhhrbfk1wz5awx60y51jccrgrcvlq7lb622sw2z0wzw4ac";
sha256 = "1v3bz0vpgh18257hdnz3yvbnl51779g1h5b265zgc21ks7m1jw5z";
};
buildInputs = [ ncurses libpulseaudio ];

View File

@ -12,13 +12,13 @@ let
;
in pythonPackages.buildPythonApplication rec {
pname = "picard";
version = "2.5";
version = "2.5.1";
src = fetchFromGitHub {
owner = "metabrainz";
repo = pname;
rev = "release-${version}";
sha256 = "02px6r086pyhpf6wia876c73bgr4xa4pyx2yykv6j74zyp5wig3z";
sha256 = "13q926iqwdba6ds5s3ir57c9bkg8gcv6dhqvhmg00fnzkq9xqk3d";
};
nativeBuildInputs = [ gettext qt5.wrapQtAppsHook qt5.qtbase ]

View File

@ -0,0 +1,34 @@
{ mkDerivation
, stdenv
, fetchFromGitHub
, qmake
, qtbase
, qtmultimedia
, libvorbis
}:
mkDerivation rec {
pname = "ptcollab";
version = "0.3.4.1";
src = fetchFromGitHub {
owner = "yuxshao";
repo = "ptcollab";
rev = "v${version}";
sha256 = "0rjyhxfad864w84n0bxyhc1jjxhzwwdx26r6psba2582g90cv024";
};
nativeBuildInputs = [ qmake ];
buildInputs = [ qtbase qtmultimedia libvorbis ];
meta = with stdenv.lib; {
description = "Experimental pxtone editor where you can collaborate with friends";
homepage = "https://yuxshao.github.io/ptcollab/";
license = licenses.mit;
maintainers = with maintainers; [ OPNA2608 ];
platforms = platforms.all;
# Requires Qt5.15
broken = stdenv.hostPlatform.isDarwin;
};
}

View File

@ -85,22 +85,6 @@ stdenv.lib.makeScope pkgs.newScope (self: with self; {
};
};
focusblur = pluginDerivation rec {
/* menu:
Blur/Focus Blur
*/
name = "focusblur-3.2.6";
buildInputs = with pkgs; [ fftwSinglePrec ];
patches = [ ./patches/focusblur-glib.patch ];
postInstall = "fail";
installPhase = "installPlugins src/focusblur";
src = fetchurl {
url = "http://registry.gimp.org/files/${name}.tar.bz2";
sha256 = "1gqf3hchz7n7v5kpqkhqh8kwnxbsvlb5cr2w2n7ngrvl56f5xs1h";
};
meta.broken = true;
};
resynthesizer = pluginDerivation rec {
/* menu:
Edit/Fill with pattern seamless...

View File

@ -10,11 +10,11 @@
mkDerivation rec {
pname = "krita";
version = "4.4.0";
version = "4.4.1";
src = fetchurl {
url = "https://download.kde.org/stable/${pname}/${version}/${pname}-${version}.tar.xz";
sha256 = "0ydmxql8iym62q0nqwn9mnb94jz1nh84i6bni0mgzwjk8p4zfzw3";
sha256 = "1bmmfvmawnlihbqkksdrwxfkaip4nfsi97w83fmvkyxl4jk715vr";
};
# *somtimes* fails with can't find ui_manager.h, also see https://github.com/NixOS/nixpkgs/issues/35359

View File

@ -7,7 +7,7 @@
stdenv.mkDerivation rec {
pname = "dbeaver-ce";
version = "7.2.3";
version = "7.2.4";
desktopItem = makeDesktopItem {
name = "dbeaver";
@ -30,7 +30,7 @@ stdenv.mkDerivation rec {
src = fetchurl {
url = "https://dbeaver.io/files/${version}/dbeaver-ce-${version}-linux.gtk.x86_64.tar.gz";
sha256 = "sha256-XYAe+e9zK/fvxBJ2Caz9/95++JzIQykXj8953IocDZU=";
sha256 = "sha256-RsXLznTz/U23e77xzyINi8HVuGqR4TrPaf+w++zPOH4=";
};
installPhase = ''

View File

@ -1,20 +1,35 @@
{ stdenv, python, fetchpatch }:
{ stdenv, fetchFromGitHub, python3, fetchpatch }:
with python.pkgs;
let
py = python3.override {
packageOverrides = self: super: {
self = py;
# not compatible with prompt_toolkit >=2.0
prompt_toolkit = super.prompt_toolkit.overridePythonAttrs (oldAttrs: rec {
name = "${oldAttrs.pname}-${version}";
version = "1.0.18";
src = oldAttrs.src.override {
inherit version;
sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
};
});
};
};
in
with py.pkgs;
buildPythonApplication rec {
pname = "haxor-news";
version = "0.4.3";
version = "unstable-2020-10-20";
src = fetchPypi {
inherit pname version;
sha256 = "5b9af8338a0f8b95a8133b66ef106553823813ac171c0aefa3f3f2dbeb4d7f88";
};
# allow newer click version
patches = fetchpatch {
url = "${meta.homepage}/commit/5b0d3ef1775756ca15b6d83fba1fb751846b5427.patch";
sha256 = "1551knh2f7yarqzcpip16ijmbx8kzdna8cihxlxx49ww55f5sg67";
# upstream hasn't done a stable release in 3+ years but is still actively developed
src = fetchFromGitHub {
owner = "donnemartin";
repo = pname;
rev = "811a5804c09406465b2b02eab638c08bf5c4fa7f";
sha256 = "1g3dfsyk4727d9jh9w6j5r51ag07851cls7v7a7hmdvdixpvbzp6";
};
propagatedBuildInputs = [
@ -26,6 +41,7 @@ buildPythonApplication rec {
six
];
# will fail without pre-seeded config files
doCheck = false;
checkInputs = [ mock ];

View File

@ -2,16 +2,16 @@
buildGoModule rec {
pname = "hugo";
version = "0.77.0";
version = "0.78.0";
src = fetchFromGitHub {
owner = "gohugoio";
repo = pname;
rev = "v${version}";
sha256 = "1vjqddcbk8afqkjzrj9wwvz697bxhv9vz0rk2vj2ji6lz1slhc56";
sha256 = "0la1c6yj9dq9rqxk6m8n8l4cabgzlk0r3was8mvgd80g3x3zn55v";
};
vendorSha256 = "03xv188jw5scqd6a8xd2s13vkn721d37bgs6a6rik7pgqmjh46c6";
vendorSha256 = "09fvvs85rvvh0z4px2bj5908xf1mrcslkzsz09p0gy5i3zaqfnp9";
doCheck = false;

View File

@ -30,12 +30,12 @@ let
in stdenv.mkDerivation rec {
pname = "obsidian";
version = "0.9.4";
version = "0.9.6";
src = fetchurl {
url =
"https://github.com/obsidianmd/obsidian-releases/releases/download/v${version}/obsidian-${version}.asar.gz";
sha256 = "0qahgm9gf4sap28wy7cxbf41h8zldplbwxnv8shyajbkxn108g5p";
sha256 = "1n8qc8ssv93xcal9fgbwvkvahzwyn6367v8gbxgc3036l66mira7";
};
nativeBuildInputs = [ makeWrapper graphicsmagick ];

View File

@ -316,7 +316,12 @@ let
patchelf --set-rpath "${libGL}/lib:$origRpath" "$chromiumBinary"
'';
passthru.updateScript = ./update.py;
passthru = {
updateScript = ./update.py;
chromiumDeps = {
gn = gnChromium;
};
};
};
# Remove some extraAttrs we supplied to the base attributes already.

View File

@ -35,26 +35,15 @@ let
mkChromiumDerivation = callPackage ./common.nix ({
inherit channel gnome gnomeSupport gnomeKeyringSupport proprietaryCodecs
cupsSupport pulseSupport useOzone;
# TODO: Remove after we can update gn for the stable channel (backward incompatible changes):
gnChromium = gn.overrideAttrs (oldAttrs: {
version = "2020-07-20";
inherit (upstream-info.deps.gn) version;
src = fetchgit {
url = "https://gn.googlesource.com/gn";
rev = "3028c6a426a4aaf6da91c4ebafe716ae370225fe";
sha256 = "0h3wf4152zdvrbb0jbj49q6814lfl3rcy5mj8b2pl9s0ahvkbc6q";
inherit (upstream-info.deps.gn) url rev sha256;
};
});
} // lib.optionalAttrs (lib.versionAtLeast upstream-info.version "87") {
useOzone = true; # YAY: https://chromium-review.googlesource.com/c/chromium/src/+/2382834 \o/
useVaapi = !stdenv.isAarch64; # TODO: Might be best to not set use_vaapi anymore (default is fine)
gnChromium = gn.overrideAttrs (oldAttrs: {
version = "2020-08-17";
src = fetchgit {
url = "https://gn.googlesource.com/gn";
rev = "6f13aaac55a977e1948910942675c69f2b4f7a94";
sha256 = "01hpma1sllpdx09mvr4d6073sg6zmk6iv44kd3r28khymcj4s251";
};
});
});
browser = callPackage ./browser.nix { inherit channel enableWideVine; };

View File

@ -1,13 +1,15 @@
#! /usr/bin/env nix-shell
#! nix-shell -i python -p python3 nix
#! nix-shell -i python -p python3 nix nix-prefetch-git
import csv
import json
import re
import subprocess
import sys
from codecs import iterdecode
from collections import OrderedDict
from datetime import datetime
from os.path import abspath, dirname
from urllib.request import urlopen
@ -26,6 +28,30 @@ def nix_prefetch_url(url, algo='sha256'):
out = subprocess.check_output(['nix-prefetch-url', '--type', algo, url])
return out.decode('utf-8').rstrip()
def nix_prefetch_git(url, rev):
print(f'nix-prefetch-git {url} {rev}')
out = subprocess.check_output(['nix-prefetch-git', '--quiet', '--url', url, '--rev', rev])
return json.loads(out)
def get_file_revision(revision, file_path):
url = f'https://raw.githubusercontent.com/chromium/chromium/{revision}/{file_path}'
with urlopen(url) as http_response:
return http_response.read()
def get_channel_dependencies(channel):
deps = get_file_revision(channel['version'], 'DEPS')
gn_pattern = b"'gn_version': 'git_revision:([0-9a-f]{40})'"
gn_commit = re.search(gn_pattern, deps).group(1).decode()
gn = nix_prefetch_git('https://gn.googlesource.com/gn', gn_commit)
return {
'gn': {
'version': datetime.fromisoformat(gn['date']).date().isoformat(),
'url': gn['url'],
'rev': gn['rev'],
'sha256': gn['sha256']
}
}
channels = {}
last_channels = load_json(JSON_PATH)
@ -58,6 +84,8 @@ with urlopen(HISTORY_URL) as resp:
# the next one.
continue
channel['deps'] = get_channel_dependencies(channel)
channels[channel_name] = channel
with open(JSON_PATH, 'w') as out:

View File

@ -1,17 +1,41 @@
{
"stable": {
"version": "86.0.4240.111",
"sha256": "05y7lwr89awkhvgmwkx3br9j4ap2aypg2wsc0nz8mi7kxc1dnyzj",
"sha256bin64": "10aqiiydw4i3jxnw8xxdgkgcqbfqc67n1fbrg40y54kg0v5dz8l6"
"version": "86.0.4240.183",
"sha256": "1g39i82js7fm4fqb8i66d6xs0kzqjxzi4vzvvwz5y9rkbikcc4ma",
"sha256bin64": "1r0dxqsx6j19hgwr3v2sdlb2vd7gb961c4wba4ymd8wy8j8pzly9",
"deps": {
"gn": {
"version": "2020-08-07",
"url": "https://gn.googlesource.com/gn",
"rev": "e327ffdc503815916db2543ec000226a8df45163",
"sha256": "0kvlfj3www84zp1vmxh76x8fdjm9hyk8lkh2vdsidafpmm75fphr"
}
}
},
"beta": {
"version": "87.0.4280.27",
"sha256": "0w0asxj7jlsw69cssfia8km4q9cx1c2mliks2rmhf4jk0hsghasm",
"sha256bin64": "1lsx4mhy8nachfb8c9f3mrx5nqw2bi046dqirb4lnv7y80jjjs1k"
"version": "87.0.4280.40",
"sha256": "07xh76fl257np68way6i5rf64qbvirkfddy7m5gvqb0fzcqd7dp3",
"sha256bin64": "1b2z0aqlh28pqrk6dmabxp1d4mvp9iyfmi4kqmns4cdpg0qgaf41",
"deps": {
"gn": {
"version": "2020-09-09",
"url": "https://gn.googlesource.com/gn",
"rev": "e002e68a48d1c82648eadde2f6aafa20d08c36f2",
"sha256": "0x4c7amxwzxs39grqs3dnnz0531mpf1p75niq7zhinyfqm86i4dk"
}
}
},
"dev": {
"version": "88.0.4298.4",
"sha256": "0ka11gmpkyrmifajaxm66c16hrj3xakdvhjqg04slyp2sv0nlhrl",
"sha256bin64": "0768y31jqbl1znp7yp6mvl5j12xl1nwjkh2l8zdga81q0wz52hh6"
"version": "88.0.4300.0",
"sha256": "00cfs2rp4h8ybn2snr1d8ygg635hx7q5gv2aqriy1j6f8a1pgh1b",
"sha256bin64": "110r1m14h91212nx6pfhn8wkics7wlwx1608l5cqsxxcpvpzl3pv",
"deps": {
"gn": {
"version": "2020-09-09",
"url": "https://gn.googlesource.com/gn",
"rev": "e002e68a48d1c82648eadde2f6aafa20d08c36f2",
"sha256": "0x4c7amxwzxs39grqs3dnnz0531mpf1p75niq7zhinyfqm86i4dk"
}
}
}
}

View File

@ -409,5 +409,6 @@ stdenv.mkDerivation rec {
# the compound is "libre" in a strict sense (some components place certain
# restrictions on redistribution), it's free enough for our purposes.
license = licenses.free;
broken = true;
};
}

View File

@ -2,8 +2,8 @@
buildGoModule rec {
pname = "kube3d";
version = "3.0.2";
k3sVersion = "1.18.6-k3s1";
version = "3.1.5";
k3sVersion = "1.18.9-k3s1";
excludedPackages = ''tools'';
@ -11,7 +11,7 @@ buildGoModule rec {
owner = "rancher";
repo = "k3d";
rev = "v${version}";
sha256 = "182n4kggwr6z75vsagfd0rl89ixcw5h13whf56jh4cd38dj8is5l";
sha256 = "0aspkar9im323d8117k48fvh1yylyspi2p2l2f5rdg1ilpa6hm53";
};
buildFlagsArray = ''
@ -22,11 +22,13 @@ buildGoModule rec {
'';
nativeBuildInputs = [ installShellFiles ];
# TODO: Move to enhanced installShellCompletion when in master: PR #83630
postInstall = ''
for shell in bash zsh; do
$out/bin/k3d completion $shell > k3d.$shell
installShellCompletion k3d.$shell
done
$out/bin/k3d completion bash > k3d.bash
$out/bin/k3d completion fish > k3d.fish
$out/bin/k3d completion zsh > _k3d
installShellCompletion k3d.{bash,fish} --zsh _k3d
'';
vendorSha256 = null;
@ -38,6 +40,6 @@ buildGoModule rec {
description = "A helper to run k3s (Lightweight Kubernetes. 5 less than k8s) in a docker container";
license = licenses.mit;
platforms = platforms.linux;
maintainers = with maintainers; [ kuznero jlesquembre ngerstle ];
maintainers = with maintainers; [ kuznero jlesquembre ngerstle jk ];
};
}

View File

@ -164,6 +164,7 @@ let
cloudfoundry = callPackage ./cloudfoundry {};
elasticsearch = callPackage ./elasticsearch {};
gandi = callPackage ./gandi {};
hcloud = callPackage ./hcloud {};
keycloak = callPackage ./keycloak {};
libvirt = callPackage ./libvirt {};
lxd = callPackage ./lxd {};

View File

@ -0,0 +1,32 @@
{ stdenv, buildGoModule, fetchFromGitHub }:
buildGoModule rec {
pname = "terraform-provider-hcloud";
version = "1.22.0";
src = fetchFromGitHub {
owner = "hetznercloud";
repo = pname;
rev = "v${version}";
sha256 = "1h4kplrmpsbwa0nq3zyqa0cnvhv1s5avdrjyf1k1f2z6b6h4gynf";
};
vendorSha256 = "070p34g0ca55rmfdwf1l53yr8vyhmm5sb8hm8q036n066yp03yfs";
# Running the tests outside the hcloud folder takes an awfully long time and
# apparently tries to reach opencensus, which fails, so only test ./hcloud.
checkPhase = ''
pushd hcloud
go test -v
popd
'';
postInstall = "mv $out/bin/terraform-provider-hcloud{,_v${version}}";
meta = with stdenv.lib; {
homepage = "https://github.com/cloudfoundry-community/terraform-provider-cloudfoundry";
description = "Terraform provider for cloudfoundry";
license = licenses.mpl20;
maintainers = with maintainers; [ ris ];
};
}

View File

@ -33,9 +33,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/archive",
"repo": "terraform-provider-archive",
"rev": "v1.3.0",
"sha256": "1hwg8ai4bvsmgnl669608lr4v940xnyig1xshps490f47c8hqy6y",
"version": "1.3.0"
"rev": "v2.0.0",
"sha256": "1d5n379zyjp2srg43g78a8h33qwcpkfkj7c35idvbyydi35vzlpl",
"version": "2.0.0"
},
"arukas": {
"owner": "terraform-providers",
@ -293,9 +293,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/external",
"repo": "terraform-provider-external",
"rev": "v1.2.0",
"sha256": "1kx28bffhd1pg3m0cbldclc8l9zic16mqrk7gybcls9vyds5gbvc",
"version": "1.2.0"
"rev": "v2.0.0",
"sha256": "16wciz08gicicsirij2ql0gy8dg0372jjsqmaigkl2n07mqz2b6a",
"version": "2.0.0"
},
"fastly": {
"owner": "terraform-providers",
@ -367,13 +367,6 @@
"sha256": "00l3cwvyyjk0n3j535qfj3bsf1s5l07786gnxycj0f8vz3a06bcq",
"version": "1.6.0"
},
"hcloud": {
"owner": "terraform-providers",
"repo": "terraform-provider-hcloud",
"rev": "v1.16.0",
"sha256": "09v2bg4ffyh4ibz449dygxgd7mvjgh4b2r242l3cwi7pzn66imrz",
"version": "1.16.0"
},
"hedvig": {
"owner": "terraform-providers",
"repo": "terraform-provider-hedvig",
@ -513,9 +506,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/local",
"repo": "terraform-provider-local",
"rev": "v1.4.0",
"sha256": "1k1kbdn99ypn1pi6vqbs1l9a8vvf4vs32wl8waa16i26514sz1wk",
"version": "1.4.0"
"rev": "v2.0.0",
"sha256": "0c1mk63lh3qmj8pl80lyvvsgyg4gg7673abr8cfxrj45635h74z5",
"version": "2.0.0"
},
"logentries": {
"owner": "terraform-providers",
@ -619,9 +612,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/null",
"repo": "terraform-provider-null",
"rev": "v2.1.2",
"sha256": "0di1hxmd3s80sz8hl5q2i425by8fbk15f0r4jmnm6vra0cq89jw2",
"version": "2.1.2"
"rev": "v3.0.0",
"sha256": "0r1kvsc96922i85hdvf1pk8aicxjr6bc69gc63qi21hrl0jpvr7r",
"version": "3.0.0"
},
"nutanix": {
"owner": "terraform-providers",
@ -789,9 +782,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/random",
"repo": "terraform-provider-random",
"rev": "v2.2.1",
"sha256": "1qklsxj443vsj61lwl7qf7xwgnllwcvb2yk6s0kn9g3iq63pcv30",
"version": "2.2.1"
"rev": "v3.0.0",
"sha256": "00dkpcri9ckp0kxwgh3p8175cyd44m8z13cb013pm4mrr61n4wq9",
"version": "3.0.0"
},
"rightscale": {
"owner": "terraform-providers",
@ -909,9 +902,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/template",
"repo": "terraform-provider-template",
"rev": "v2.1.2",
"sha256": "18w1mmma81m9j7yf6q500w8v9ss28w6sw2ynssl99pyw2gwmd04q",
"version": "2.1.2"
"rev": "v2.2.0",
"sha256": "12pn1i06jz4xl50md94yfdggg3pg5bv1viwf35izizm5rnyksyv2",
"version": "2.2.0"
},
"tencentcloud": {
"owner": "terraform-providers",
@ -946,9 +939,9 @@
"owner": "hashicorp",
"provider-source-address": "registry.terraform.io/hashicorp/tls",
"repo": "terraform-provider-tls",
"rev": "v2.1.1",
"sha256": "1qsx540pjcq4ra034q2dwnw5nmzab5h1c3vm20ppg5dkhhyiizq8",
"version": "2.1.1"
"rev": "v3.0.0",
"sha256": "1p9d5wrr4xwf2i930zlcarm1zl8ysj3nyc6rrbhpxk04kr6ap0wz",
"version": "3.0.0"
},
"triton": {
"owner": "terraform-providers",

View File

@ -4,11 +4,11 @@
stdenv.mkDerivation rec {
pname = "mcabber";
version = "1.1.0";
version = "1.1.2";
src = fetchurl {
url = "https://mcabber.com/files/mcabber-${version}.tar.bz2";
sha256 = "1ggh865p1rf10ffsnf4g6qv9i8bls36dxdb1nzs5r9vdqci2rz04";
sha256 = "0q1i5acyghsmzas88qswvki8kkk2nfpr8zapgnxbcd3lwcxl38f4";
};
nativeBuildInputs = [ pkgconfig ];

View File

@ -19,13 +19,13 @@
mkDerivation rec {
pname = "nextcloud-client";
version = "3.0.2";
version = "3.0.3";
src = fetchFromGitHub {
owner = "nextcloud";
repo = "desktop";
rev = "v${version}";
sha256 = "ROzaiRa9Odq4lXuHL7nbE0S49d0wxmDgm01qI1WM+WM=";
sha256 = "0idh8i71jivdjjs2y62l22yl3qxwgcr0hf53dad587bzgkkkr223";
};
patches = [

View File

@ -1,34 +1,40 @@
{stdenv, fetchFromGitHub, makeWrapper, gettext, python3Packages, rsync, cron, openssh, sshfs-fuse, encfs }:
{stdenv, lib, fetchFromGitHub, makeWrapper, gettext,
python3, rsync, cron, openssh, sshfs-fuse, encfs }:
let
inherit (python3Packages) python dbus-python keyring;
in stdenv.mkDerivation rec {
version = "1.1.24";
python' = python3.withPackages (ps: with ps; [ dbus-python keyring ]);
apps = lib.makeBinPath [ openssh python' cron rsync sshfs-fuse encfs ];
in stdenv.mkDerivation rec {
pname = "backintime-common";
version = "1.2.1";
src = fetchFromGitHub {
owner = "bit-team";
repo = "backintime";
rev = "v${version}";
sha256 = "0g6gabnr60ns8854hijdddbanks7319q4n3fj5l6rc4xsq0qck18";
sha256 = "mBjheLY7DHs995heZmxVnDdvABkAROCjRJ4a/uJmJcg=";
};
buildInputs = [ makeWrapper gettext python dbus-python keyring openssh cron rsync sshfs-fuse encfs ];
nativeBuildInputs = [ makeWrapper gettext ];
buildInputs = [ python' ];
installFlags = [ "DEST=$(out)" ];
preConfigure = "cd common";
preConfigure = ''
cd common
substituteInPlace configure \
--replace "/.." "" \
--replace "share/backintime" "${python'.sitePackages}/backintime"
substituteInPlace "backintime" \
--replace "share" "${python'.sitePackages}"
'';
dontAddPrefix = true;
preFixup =
''
substituteInPlace "$out/bin/backintime" \
--replace "=\"/usr/share" "=\"$prefix/share"
preFixup = ''
wrapProgram "$out/bin/backintime" \
--prefix PYTHONPATH : "$PYTHONPATH" \
--prefix PATH : "$prefix/bin:$PATH"
--prefix PATH : ${apps}
'';
meta = {

View File

@ -0,0 +1,26 @@
{ mkDerivation, backintime-common, python3 }:
let
python' = python3.withPackages (ps: with ps; [ pyqt5 backintime-common ]);
in
mkDerivation {
inherit (backintime-common)
version src installFlags meta dontAddPrefix nativeBuildInputs;
pname = "backintime-qt";
buildInputs = [ python' backintime-common ];
preConfigure = ''
cd qt
substituteInPlace configure \
--replace '"/../etc' '"/etc'
substituteInPlace qttools.py \
--replace "__file__, os.pardir, os.pardir" '"${backintime-common}/${python'.sitePackages}/backintime"'
'';
preFixup = ''
wrapQtApp "$out/bin/backintime-qt" \
--prefix PATH : "${backintime-common}/bin:$PATH"
'';
}

View File

@ -1,28 +0,0 @@
{stdenv, makeWrapper, gettext, backintime-common, python3, python3Packages }:
stdenv.mkDerivation {
inherit (backintime-common) version src installFlags;
pname = "backintime-qt4";
buildInputs = [ makeWrapper gettext python3 python3Packages.pyqt4 backintime-common python3 ];
preConfigure = "cd qt4";
configureFlags = [ ];
dontAddPrefix = true;
preFixup =
''
substituteInPlace "$out/bin/backintime-qt4" \
--replace "=\"/usr/share" "=\"$prefix/share"
wrapProgram "$out/bin/backintime-qt4" \
--prefix PYTHONPATH : "${backintime-common}/share/backintime/common:$PYTHONPATH" \
--prefix PATH : "${backintime-common}/bin:$PATH"
'';
meta = with stdenv.lib; {
broken = true;
};
}

View File

@ -3,17 +3,17 @@
let
common = { stname, target, postInstall ? "" }:
buildGoModule rec {
version = "1.10.0";
version = "1.11.1";
name = "${stname}-${version}";
src = fetchFromGitHub {
owner = "syncthing";
repo = "syncthing";
rev = "v${version}";
sha256 = "0wi8k248qr80vscb5qwh2ygiyy2am9hh6a8c1il1h2702ch2cd45";
sha256 = "0x5a24r74i9am6a8k32qkb2vck28d2jiy4yhpb4g774m4krjqxd2";
};
vendorSha256 = "0as1kn7bpgp5b82pf1bgr23az1qq8x85zr2zwgqsx57yjbc18658";
vendorSha256 = "0ap287996ix119hkdyssn2q2bqjbgdshi9a67hf8msfp7k9warm7";
doCheck = false;

View File

@ -1,25 +0,0 @@
{ stdenv, fetchurl, zlib, unzip }:
stdenv.mkDerivation {
name = "plink-1.07";
src = fetchurl {
url = "http://pngu.mgh.harvard.edu/~purcell/plink/dist/plink-1.07-src.zip";
sha256 = "4af56348443d0c6a1db64950a071b1fcb49cc74154875a7b43cccb4b6a7f482b";
};
buildInputs = [ zlib unzip ] ;
installPhase = ''
mkdir -p $out/bin
cp plink $out/bin
'';
meta = {
description = "Whole genome association toolkit";
homepage = "http://pngu.mgh.harvard.edu/~purcell/plink/";
license = stdenv.lib.licenses.gpl2;
platforms = stdenv.lib.platforms.all;
broken = true;
};
}

View File

@ -4,12 +4,12 @@ with stdenv.lib;
stdenv.mkDerivation rec {
pname = "marvin";
version = "20.17.0";
version = "20.19.0";
src = fetchurl {
name = "marvin-${version}.deb";
url = "http://dl.chemaxon.com/marvin/${version}/marvin_linux_${versions.majorMinor version}.deb";
sha256 = "0ip6ma9ivk5b74s9najn2rrkiha7hya1rjhgyrc71kwsj5gqgli0";
sha256 = "0b9a0yl3mxfb2dfdkgs2wphhxsgwixqk6nl2hsn1ly3gz53cws1q";
};
nativeBuildInputs = [ dpkg makeWrapper ];

View File

@ -4,11 +4,11 @@
buildPythonApplication rec {
pname = "git-machete";
version = "2.15.6";
version = "2.15.7";
src = fetchPypi {
inherit pname version;
sha256 = "0ajb3m3i3pfc5v3gshglk7qphk1rpniwx8q8isgx1a6cyarzr9bd";
sha256 = "0djbl4s9i7bs7kkldr7453yayi38s8mx0i41mkd0j2cvv5r9himr";
};
nativeBuildInputs = [ installShellFiles pbr ];

View File

@ -1,24 +0,0 @@
{ stdenv, fetchurl, perl }:
stdenv.mkDerivation {
name = "dvb-apps-7f68f9c8d311";
src = fetchurl {
url = "https://linuxtv.org/hg/dvb-apps/archive/7f68f9c8d311.tar.gz";
sha256 = "0a6c5jjq6ad98bj0r954l3n7zjb2syw9m19jksg06z4zg1z8yg82";
};
buildInputs = [ perl ];
dontConfigure = true; # skip configure
installPhase = "make prefix=$out install";
meta = {
description = "Linux DVB API applications and utilities";
homepage = "https://linuxtv.org/";
platforms = stdenv.lib.platforms.linux;
license = stdenv.lib.licenses.gpl2;
broken = true; # 2018-04-10
};
}

View File

@ -1,6 +1,27 @@
{ stdenv, fetchFromGitHub, makeWrapper, bash, nodejs, nodePackages, gzip }:
{ stdenv
, fetchFromGitHub
, common-updater-scripts
, genericUpdater
, writers
, makeWrapper
, bash
, nodejs
, nodePackages
, gzip
, jq
}:
let
# NOTE: use updateScript to bump the package version
pname = "EPGStation";
version = "1.7.5";
src = fetchFromGitHub {
owner = "l3tnun";
repo = "EPGStation";
rev = "v${version}";
sha256 = "06yaf5yb5rp3q0kdhw33df7px7vyfby885ckb6bdzw3wnams5d8m";
};
workaround-opencollective-buildfailures = stdenv.mkDerivation {
# FIXME: This should be removed when a complete fix is available
# https://github.com/svanderburg/node2nix/issues/145
@ -12,14 +33,9 @@ let
chmod +x $out/bin/opencollective-postinstall
'';
};
in
nodePackages.epgstation.override (drv: {
src = fetchFromGitHub {
owner = "l3tnun";
repo = "EPGStation";
rev = "v${drv.version}"; # version specified in ./generate.sh
sha256 = "15z1kdbamj97frp3dfnbm0h8krihmv2xdab4id0rxin29ibrw1k2";
};
pkg = nodePackages.epgstation.override (drv: {
inherit src;
buildInputs = [ bash ];
nativeBuildInputs = [
@ -55,9 +71,9 @@ nodePackages.epgstation.override (drv: {
rm -rf logs
# Replace the existing configuration and runtime state directories with
# symlinks. Without this, they would all be non-writable because they reside
# in the Nix store. Note that the source path won't be accessible at build
# time.
# symlinks. Without this, they would all be non-writable because they
# reside in the Nix store. Note that the source path won't be accessible
# at build time.
rm -r config data recorded thumbnail
ln -sfT /etc/epgstation config
ln -sfT /var/lib/epgstation data
@ -71,8 +87,19 @@ nodePackages.epgstation.override (drv: {
popd
'';
meta = with stdenv.lib; drv.meta // {
maintainers = with maintainers; [ midchildan ];
# NOTE: this may take a while since it has to update all packages in
# nixpkgs.nodePackages
passthru.updateScript = import ./update.nix {
inherit (stdenv) lib;
inherit (src.meta) homepage;
inherit
pname
version
common-updater-scripts
genericUpdater
writers
jq;
};
# nodePackages.epgstation is a stub package to fetch npm dependencies and
# is marked as broken to prevent users from installing it directly. This
@ -80,6 +107,16 @@ nodePackages.epgstation.override (drv: {
# nixpkgs while still allowing us to heavily customize the build. It also
# allows us to provide devDependencies for the epgstation build process
# without doing the same for all the other node packages.
broken = false;
meta = drv.meta // { broken = false; };
});
in
pkg // {
name = "${pname}-${version}";
meta = with stdenv.lib; pkg.meta // {
maintainers = with maintainers; [ midchildan ];
# NOTE: updateScript relies on this being correct
position = toString ./default.nix + ":1";
};
})
}

View File

@ -1,34 +0,0 @@
#!/usr/bin/env bash
# Script to generate the Nix package definition for EPGStation. Run this script
# when bumping the package version.
VERSION="1.7.4"
URL="https://raw.githubusercontent.com/l3tnun/EPGStation/v$VERSION/package.json"
JQ_BIN="$(nix-build ../../../.. --no-out-link -A jq)/bin/jq"
set -eu -o pipefail
cd "$(dirname "${BASH_SOURCE[0]}")"
main() {
# update package.json
curl -sSfL "$URL" \
| jq '. + {"dependencies": (.devDependencies + .dependencies)} | del(.devDependencies)' \
> package.json
# regenerate node packages to update the actual Nix package
pushd ../../../development/node-packages \
&& ./generate.sh
popd
# generate default streaming settings for EPGStation
pushd ../../../../nixos/modules/services/video/epgstation \
&& cat "$(./generate)" > streaming.json
popd
}
jq() {
"$JQ_BIN" "$@"
}
main "@"

View File

@ -1,23 +1,7 @@
{
"name": "EPGStation",
"version": "1.7.4",
"version": "1.7.5",
"description": "DTV Software in Japan.",
"main": "dist/server/index.js",
"scripts": {
"start": "node dist/server/index.js",
"dev-start": "node dist/server/index.js --env development",
"clean": "gulp clean",
"build": "gulp build --max_old_space_size=768 --env production",
"dev-build": "gulp build --max_old_space_size=512 --env development",
"test": "echo \"Error: no test specified\" && exit 1",
"task": "gulp --max_old_space_size=512",
"install-win-service": "winser -i -a",
"uninstall-win-service": "winser -r -x",
"backup": "node dist/server/DBTools.js -m backup -o",
"restore": "node dist/server/DBTools.js -m restore -o",
"move-log": "node dist/server/LogFileMoveTools.js",
"convert-str": "node dist/server/ConvertDBStrTools.js"
},
"repository": {
"type": "git",
"url": "https://github.com/l3tnun/EPGStation.git"
@ -32,23 +16,53 @@
"node": "^10.x.x < 11 || ^12.14.0 < 13 || ^14.5.0 < 15"
},
"dependencies": {
"aribts": "^2.1.12",
"b24.js": "1.0.3",
"basic-auth": "2.0.1",
"body-parser": "1.19.0",
"chart.js": "2.9.3",
"css-ripple-effect": "1.0.5",
"diskusage": "1.1.3",
"express": "4.17.1",
"express-openapi": "7.0.1",
"fs-extra": "9.0.1",
"hls-b24.js": "0.12.3",
"js-yaml": "3.14.0",
"lodash": "4.17.20",
"log4js": "6.3.0",
"material-design-icons": "3.0.1",
"material-design-lite": "1.3.0",
"minimist": "1.2.5",
"mirakurun": "3.3.1",
"mithril": "2.0.4",
"mkdirp": "1.0.4",
"multer": "1.4.2",
"mysql": "2.18.1",
"openapi-types": "7.0.1",
"pg": "8.3.3",
"request": "2.88.2",
"socket.io": "2.3.0",
"socket.io-client": "2.3.0",
"sqlite3": "5.0.0",
"swagger-ui-dist": "3.34.0",
"url-join": "4.0.1",
"@types/basic-auth": "1.1.3",
"@types/body-parser": "1.19.0",
"@types/chart.js": "2.9.23",
"@types/express": "4.17.7",
"@types/hls.js": "0.13.0",
"@types/chart.js": "2.9.24",
"@types/express": "4.17.8",
"@types/hls.js": "0.13.1",
"@types/js-yaml": "3.12.5",
"@types/lodash": "4.14.158",
"@types/lodash": "4.14.161",
"@types/material-design-lite": "1.1.16",
"@types/minimist": "1.2.0",
"@types/mithril": "2.0.3",
"@types/mkdirp": "1.0.1",
"@types/multer": "1.4.3",
"@types/multer": "1.4.4",
"@types/mysql": "2.15.15",
"@types/node": "14.0.26",
"@types/pg": "7.14.4",
"@types/node": "14.11.1",
"@types/pg": "7.14.5",
"@types/request": "2.48.5",
"@types/socket.io": "2.1.10",
"@types/socket.io": "2.1.11",
"@types/socket.io-client": "1.4.33",
"@types/sqlite3": "3.1.6",
"@types/url-join": "4.0.0",
@ -61,41 +75,11 @@
"gulp-sourcemaps": "2.6.5",
"gulp-tslint": "8.1.4",
"gulp-typescript": "5.0.1",
"terser-webpack-plugin": "3.0.7",
"ts-loader": "8.0.1",
"tslint": "6.1.2",
"typescript": "3.9.7",
"webpack": "4.44.0",
"webpack-stream": "5.2.1",
"aribts": "^2.1.12",
"b24.js": "1.0.3",
"basic-auth": "2.0.1",
"body-parser": "1.19.0",
"chart.js": "2.9.3",
"css-ripple-effect": "1.0.5",
"diskusage": "1.1.3",
"express": "4.17.1",
"express-openapi": "7.0.0",
"fs-extra": "9.0.1",
"hls-b24.js": "0.12.3",
"js-yaml": "3.14.0",
"lodash": "4.17.19",
"log4js": "6.3.0",
"material-design-icons": "3.0.1",
"material-design-lite": "1.3.0",
"minimist": "1.2.5",
"mirakurun": "3.2.0",
"mithril": "2.0.4",
"mkdirp": "1.0.4",
"multer": "1.4.2",
"mysql": "2.18.1",
"openapi-types": "7.0.0",
"pg": "8.3.0",
"request": "2.88.2",
"socket.io": "2.3.0",
"socket.io-client": "2.3.0",
"sqlite3": "5.0.0",
"swagger-ui-dist": "3.30.2",
"url-join": "4.0.1"
"terser-webpack-plugin": "4.2.2",
"ts-loader": "8.0.4",
"tslint": "6.1.3",
"typescript": "4.0.3",
"webpack": "4.44.2",
"webpack-stream": "6.1.0"
}
}

View File

@ -0,0 +1,66 @@
{ pname
, version
, homepage
, lib
, common-updater-scripts
, genericUpdater
, writers
, jq
}:
let
updater = genericUpdater {
inherit pname version;
attrPath = lib.toLower pname;
rev-prefix = "v";
versionLister = "${common-updater-scripts}/bin/list-git-tags ${homepage}";
};
updateScript = builtins.elemAt updater 0;
updateArgs = map (lib.escapeShellArg) (builtins.tail updater);
in writers.writeBash "update-epgstation" ''
set -euxo pipefail
# bump the version
${updateScript} ${lib.concatStringsSep " " updateArgs}
cd "${toString ./.}"
# Get the path to the latest source. Note that we can't just pass the value
# of epgstation.src directly because it'd be evaluated before we can run
# updateScript.
SRC="$(nix-build ../../../.. --no-out-link -A epgstation.src)"
if [[ "${version}" == "$(${jq}/bin/jq -r .version "$SRC/package.json")" ]]; then
echo "[INFO] Already using the latest version of ${pname}" >&2
exit
fi
# Regenerate package.json from the latest source.
${jq}/bin/jq '. + {
dependencies: (.dependencies + .devDependencies),
} | del(.devDependencies, .main, .scripts)' \
"$SRC/package.json" \
> package.json
# Regenerate node packages to update the pre-overriden epgstation derivation.
# This must come *after* package.json has been regenerated.
pushd ../../../development/node-packages
./generate.sh
popd
# Generate default streaming settings for the nixos module.
pushd ../../../../nixos/modules/services/video/epgstation
${jq}/bin/jq '
{ liveHLS
, liveMP4
, liveWebM
, mpegTsStreaming
, mpegTsViewer
, recordedDownloader
, recordedStreaming
, recordedHLS
, recordedViewer
}' \
"$SRC/config/config.sample.json" \
> streaming.json
popd
''

View File

@ -35,13 +35,13 @@ let
in
stdenv.mkDerivation rec {
pname = "crun";
version = "0.15";
version = "0.15.1";
src = fetchFromGitHub {
owner = "containers";
repo = pname;
rev = version;
sha256 = "0cqzk2lm1w0g2v6qhiliq565cf4p7hzh839jb01p3i5cr9kx11kc";
sha256 = "0qy4159wirkwzb48kp1jsnimlr1fyvxvv02j6mdbhjdhkwjic8v4";
fetchSubmodules = true;
};

View File

@ -0,0 +1,21 @@
diff --git a/Makefile b/Makefile
index 0070ada..802cef0 100644
--- a/Makefile
+++ b/Makefile
@@ -202,7 +202,7 @@ $(BIN_NAME): $(BIN_OBJS)
##### Public rules #####
all: CPPFLAGS += -DNDEBUG
-all: shared static tools
+all: shared tools
# Run with ASAN_OPTIONS="protect_shadow_gap=0" to avoid CUDA OOM errors
debug: CFLAGS += -pedantic -fsanitize=undefined -fno-omit-frame-pointer -fno-common -fsanitize=address
@@ -232,7 +232,6 @@ install: all
# Install header files
$(INSTALL) -m 644 $(LIB_INCS) $(DESTDIR)$(includedir)
# Install library files
- $(INSTALL) -m 644 $(LIB_STATIC) $(DESTDIR)$(libdir)
$(INSTALL) -m 755 $(LIB_SHARED) $(DESTDIR)$(libdir)
$(LN) -sf $(LIB_SONAME) $(DESTDIR)$(libdir)/$(LIB_SYMLINK)
$(LDCONFIG) -n $(DESTDIR)$(libdir)

View File

@ -1,17 +1,23 @@
{ stdenv, lib, fetchFromGitHub, pkgconfig, libelf, libcap, libseccomp }:
with lib; let
{ stdenv
, lib
, fetchFromGitHub
, pkgconfig
, libelf
, libcap
, libseccomp
, rpcsvc-proto
, libtirpc
}:
let
modp-ver = "396.51";
nvidia-modprobe = fetchFromGitHub {
owner = "NVIDIA";
repo = "nvidia-modprobe";
rev = modp-ver;
sha256 = "1fw2qwc84k64agw6fx2v0mjf88aggph9c6qhs4cv7l3gmflv8qbk";
};
in stdenv.mkDerivation rec {
in
stdenv.mkDerivation rec {
pname = "libnvidia-container";
version = "1.0.6";
@ -22,19 +28,32 @@ in stdenv.mkDerivation rec {
sha256 = "1pnpc9knwh8d1zqb28zc3spkjc00w0z10vd3jna8ksvpl35jl7w3";
};
patches = [
# locations of nvidia-driver libraries are not resolved via ldconfig which
# doesn't get used on NixOS. Additional support binaries like nvidia-smi are
# not resolved via the environment PATH but via the derivation output path.
patches = [ ./libnvc-ldconfig-and-path-fixes.patch ];
# doesn't get used on NixOS. Additional support binaries like nvidia-smi
# are not resolved via the environment PATH but via the derivation output
# path.
./libnvc-ldconfig-and-path-fixes.patch
# the libnvidia-container Makefile wants to build and install static
# libtirpc libraries; this patch prevents that from happening
./avoid-static-libtirpc-build.patch
];
makeFlags = [
"WITH_LIBELF=yes"
"prefix=$(out)"
# we can't use the WITH_TIRPC=yes flag that exists in the Makefile for the
# same reason we patch out the static library use of libtirpc so we set the
# define in CFLAGS
"CFLAGS=-DWITH_TIRPC"
];
postPatch = ''
sed -i 's/^REVISION :=.*/REVISION = ${src.rev}/' mk/common.mk
sed -i 's/^COMPILER :=.*/COMPILER = $(CC)/' mk/common.mk
sed -i \
-e 's/^REVISION :=.*/REVISION = ${src.rev}/' \
-e 's/^COMPILER :=.*/COMPILER = $(CC)/' \
mk/common.mk
mkdir -p deps/src/nvidia-modprobe-${modp-ver}
cp -r ${nvidia-modprobe}/* deps/src/nvidia-modprobe-${modp-ver}
@ -42,11 +61,14 @@ in stdenv.mkDerivation rec {
touch deps/src/nvidia-modprobe-${modp-ver}/.download_stamp
'';
nativeBuildInputs = [ pkgconfig ];
NIX_CFLAGS_COMPILE = [ "-I${libtirpc.dev}/include/tirpc" ];
NIX_LDFLAGS = [ "-L${libtirpc.dev}/lib" "-ltirpc" ];
buildInputs = [ libelf libcap libseccomp ];
nativeBuildInputs = [ pkgconfig rpcsvc-proto ];
meta = {
buildInputs = [ libelf libcap libseccomp libtirpc ];
meta = with lib; {
homepage = "https://github.com/NVIDIA/libnvidia-container";
description = "NVIDIA container runtime library";
license = licenses.bsd3;

View File

@ -5,11 +5,11 @@
stdenv.mkDerivation rec {
pname = "i3";
version = "4.18.2";
version = "4.18.3";
src = fetchurl {
url = "https://i3wm.org/downloads/${pname}-${version}.tar.bz2";
sha256 = "030jym6b8b07yf4y6pb806hg8k77zsprv569gy0r72rh5zb1g1mj";
sha256 = "03dijnwv2n8ak9jq59fhq0rc80m5wjc9d54fslqaivnnz81pkbjk";
};
nativeBuildInputs = [ which pkgconfig makeWrapper installShellFiles ];

View File

@ -1,4 +1,13 @@
{ stdenv, rustPlatform, fetchFromGitHub, pkgconfig, dbus, libpulseaudio }:
{ stdenv
, rustPlatform
, fetchFromGitHub
, pkgconfig
, makeWrapper
, dbus
, libpulseaudio
, notmuch
, ethtool
}:
rustPlatform.buildRustPackage rec {
pname = "i3status-rust";
@ -13,9 +22,17 @@ rustPlatform.buildRustPackage rec {
cargoSha256 = "1dcfclk8lbqvq2hywr80jm63p1i1kz3893zq99ipgryia46vd397";
nativeBuildInputs = [ pkgconfig ];
nativeBuildInputs = [ pkgconfig makeWrapper ];
buildInputs = [ dbus libpulseaudio ];
buildInputs = [ dbus libpulseaudio notmuch ];
cargoBuildFlags = [
"--features=notmuch"
];
postFixup = ''
wrapProgram $out/bin/i3status-rs --prefix PATH : "${ethtool}/bin"
'';
# Currently no tests are implemented, so we avoid building the package twice
doCheck = false;

View File

@ -9,13 +9,13 @@
with stdenv.lib;
stdenv.mkDerivation rec {
pname = "icewm";
version = "1.8.3";
version = "1.9.0";
src = fetchFromGitHub {
owner = "bbidulock";
repo = pname;
rev = version;
sha256 = "sha256-cTJQlUaGvbJmu1xYwFp5GPrM5NjfKBzaMc+l7FeMUP8=";
sha256 = "08prc9ip96bxbmkkab0ymma9yisgs5yzymg4gjcvr945bj4q7crb";
};
nativeBuildInputs = [ cmake pkgconfig perl asciidoc ];


@ -0,0 +1,25 @@
{ lib, fetchzip }:
let
version = "1.000";
in
fetchzip {
name = "fraunces-${version}";
url = "https://github.com/undercasetype/Fraunces/releases/download/${version}/UnderCaseType_Fraunces_${version}.zip";
sha256 = "0qgl140qkn9p87x7pk60fd3lj206y5h0fq2xkcj2qiv3sxbqxwqb";
postFetch = ''
mkdir -p $out/share/fonts/
unzip -j $downloadedFile \*.otf -d $out/share/fonts/opentype
unzip -j $downloadedFile \*.ttf -d $out/share/fonts/truetype
'';
meta = with lib; {
description = "A display, Old Style soft-serif typeface inspired by early 20th century typefaces";
homepage = "https://github.com/undercasetype/Fraunces";
license = licenses.ofl;
maintainers = [ maintainers.marsam ];
platforms = platforms.all;
};
}


@ -1,4 +1,4 @@
{ stdenv, fetchFromGitHub, inkscape, xcursorgen }:
{ stdenv, fetchFromGitHub, fetchpatch, inkscape, xcursorgen }:
stdenv.mkDerivation rec {
version = "1.1";
@ -14,6 +14,15 @@ stdenv.mkDerivation rec {
nativeBuildInputs = [ inkscape xcursorgen ];
patches = [
# Remove when https://github.com/numixproject/numix-cursor-theme/pull/7 is merged
(fetchpatch {
url = "https://github.com/stephaneyfx/numix-cursor-theme/commit/3b647bf768cebb8f127b88e3786f6a9640460197.patch";
sha256 = "174kmhlvv76wwvndkys78aqc32051sqg3wzc0xg6b7by4agrbg76";
name = "support-inkscape-1-in-numix-cursor-theme.patch";
})
];
buildPhase = ''
patchShebangs .
HOME=$TMP ./build.sh


@ -2,13 +2,13 @@
stdenv.mkDerivation rec {
pname = "numix-icon-theme-circle";
version = "20.07.11";
version = "20.09.19";
src = fetchFromGitHub {
owner = "numixproject";
repo = pname;
rev = version;
sha256 = "0vj3d3wb12ksnkm99s32k7nrf9m5j83zzvkd0rwk8l0b30df975j";
sha256 = "1rqlq5ssxqj0nc0i8av7zprj94km5645xzqi5j5i0sxd3jbmyfjx";
};
nativeBuildInputs = [ gtk3 ];


@ -2,13 +2,13 @@
stdenv.mkDerivation rec {
pname = "numix-icon-theme-square";
version = "20.07.11";
version = "20.09.19";
src = fetchFromGitHub {
owner = "numixproject";
repo = pname;
rev = version;
sha256 = "07jy8l2r6grn7pabn3dnkc8j7xdykl7k57br30c6v61ss8paf2rw";
sha256 = "0afraarfcd66mpidmn0l90wif8kmwzdj3s09g704kwszyijxs80z";
};
nativeBuildInputs = [ gtk3 ];


@ -1,30 +0,0 @@
{ stdenv, fetchFromGitHub }:
stdenv.mkDerivation rec {
pname = "gnome-shell-extension-battery-status";
version = "6";
src = fetchFromGitHub {
owner = "milliburn";
repo = "gnome-shell-extension-battery_status";
rev = "v${version}";
sha256 = "1w83h863mzffjnmk322xq90qf3y9dzay1w9yw5r0qnbsq1ljl8p4";
};
uuid = "battery_status@milliburn.github.com";
installPhase = ''
runHook preInstall
mkdir -p $out/share/gnome-shell/extensions
cp -r ${uuid} $out/share/gnome-shell/extensions/
runHook postInstall
'';
meta = with stdenv.lib; {
description = "Configurable lightweight battery charge indicator and autohider";
license = licenses.gpl2;
broken = true; # not compatable with latest GNOME
maintainers = with maintainers; [ jonafato ];
homepage = "https://github.com/milliburn/gnome-shell-extension-battery_status";
};
}


@ -2,11 +2,11 @@
stdenv.mkDerivation rec {
pname = "lxtask";
version = "0.1.9";
version = "0.1.10";
src = fetchurl {
url = "mirror://sourceforge/lxde/${pname}-${version}.tar.xz";
sha256 = "0cv4hx5dg01hbyi5p10pl78n0a40xajpq4wx9c7886pkmpq8isj1";
sha256 = "0b2fxg8jjjpk219gh7qa18g45365598nd2bq7rrq0bdvqjdxy5i2";
};
nativeBuildInputs = [ pkgconfig intltool ];


@ -2,6 +2,14 @@
, fetchurl, perl, gcc
, ncurses6, gmp, glibc, libiconv, numactl
, llvmPackages
# minimal = true; will remove files that aren't strictly necessary for
# regular builds and GHC bootstrapping.
# This is "useful" for staying within hydra's output limits for at least the
# aarch64-linux architecture.
# Examples of unnecessary files are the bundled documentation and files that
# are only needed for profiling builds.
, minimal ? false
}:
# Prebuilt only does native
@ -172,6 +180,13 @@ stdenv.mkDerivation rec {
for file in $(find "$out" -name setup-config); do
substituteInPlace $file --replace /usr/bin/ranlib "$(type -P ranlib)"
done
'' +
stdenv.lib.optionalString minimal ''
# Remove profiling objects
find $out -type f -name '*.p_o' -delete
rm $out/lib/ghc-*/bin/ghc-iserv-prof
# Remove docs
rm -r $out/share/{doc,man}
'';
doInstallCheck = true;
@ -195,11 +210,18 @@ stdenv.mkDerivation rec {
enableShared = true;
};
meta = {
meta = let
platforms = ["x86_64-linux" "armv7l-linux" "aarch64-linux" "i686-linux" "x86_64-darwin"];
in {
homepage = "http://haskell.org/ghc";
description = "The Glasgow Haskell Compiler";
license = stdenv.lib.licenses.bsd3;
platforms = ["x86_64-linux" "armv7l-linux" "aarch64-linux" "i686-linux" "x86_64-darwin"];
# The minimal variation can not be distributed because it removes the
# documentation, including licensing information that is required for
# distribution.
inherit platforms;
hydraPlatforms = stdenv.lib.optionals (!minimal) platforms;
maintainers = with stdenv.lib.maintainers; [ lostnet ];
};
}
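The `minimal` flag above trims profiling objects, documentation and man pages from this prebuilt GHC so that it fits within Hydra's output limits. A minimal sketch of how a consumer might opt into it, assuming this expression is instantiated via `callPackage` (so `.override` is available) and exposed under an attribute such as `haskell.compiler.ghc8102Binary` — the exact attribute name is an assumption:

```nix
# Hypothetical override sketch; the attribute path haskell.compiler.ghc8102Binary
# is an assumption and may differ from the attribute actually used in nixpkgs.
{ pkgs ? import <nixpkgs> { } }:

pkgs.haskell.compiler.ghc8102Binary.override {
  # Strip profiling objects, docs and man pages, as implemented by the
  # optionalString minimal block above.
  minimal = true;
}
```

Note that, as the meta comment explains, such a minimal build removes the bundled licensing documentation and therefore cannot be redistributed via Hydra.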


@ -2,13 +2,13 @@
rustPlatform.buildRustPackage rec {
pname = "gleam";
version = "0.11.2";
version = "0.12.0";
src = fetchFromGitHub {
owner = "gleam-lang";
repo = pname;
rev = "v${version}";
sha256 = "1g8yfp1xpkv1lqz8azam40cvrs5cggxlyrb72h8k88br75qmi6hj";
sha256 = "1hlbskpfqdh5avmqnry69s7x0wj6l6yaqkayx7lj6z99p58p9zrz";
};
nativeBuildInputs = [ pkg-config ];
@ -16,7 +16,7 @@ rustPlatform.buildRustPackage rec {
buildInputs = [ openssl ] ++
stdenv.lib.optionals stdenv.isDarwin [ Security ];
cargoSha256 = "1gfr6c4i5kx8x3q23s4b4n25z2k6xkxpk12acr4ry97pyj2lr5wq";
cargoSha256 = "1djznh7v6ha4ks8l8arwwn301qclmb7iih774q5y7sbzqrv7sw0q";
meta = with stdenv.lib; {
description = "A statically typed language for the Erlang VM";


@ -0,0 +1,167 @@
{ stdenv, fetchurl, perl, unzip, glibc, zlib, setJavaClassPath }:
let
common = javaVersion:
let
graalvmXXX-ce = stdenv.mkDerivation rec {
pname = "graalvm${javaVersion}-ce";
version = "20.2.0";
srcs = [
(fetchurl {
sha256 = { "8" = "1s64zkkrns1ykh6dwpjrqy0hs9m1bb08cf7ss7msx33h9ivir5b0";
"11" = "0aaf0sjsnlckhgsh3j4lph0shahw6slf4yndqcm2swc8i1dlpdsx";
}.${javaVersion};
url = "https://github.com/graalvm/graalvm-ce-builds/releases/download/vm-${version}/graalvm-ce-java${javaVersion}-linux-amd64-${version}.tar.gz";
})
(fetchurl {
sha256 = { "8" = "1cisyyzab4pdvzavnivhy9w6dwn36ybaxw40w767m142fbi06m3b";
"11" = "0p4j6mxajmb0xl41c79154pk4vb8bffgg1nmwislahqjky9jkd4j";
}.${javaVersion};
url = "https://github.com/graalvm/graalvm-ce-builds/releases/download/vm-${version}/native-image-installable-svm-java${javaVersion}-linux-amd64-${version}.jar";
})
(fetchurl {
sha256 = { "8" = "0rwwvk1mkfnl0b50xg7kh6015kjmsw2ra0ckrzmabl88z4bnzh2y";
"11" = "0lc9as2a00j74lp7jby4p10vn5bbkiydzvzk28zfcbsp28p4wvwn";
}.${javaVersion};
url = "https://github.com/oracle/truffleruby/releases/download/vm-${version}/ruby-installable-svm-java${javaVersion}-linux-amd64-${version}.jar";
})
(fetchurl {
sha256 = { "8" = "0mj8p72qgvvrwpsbk0bsqldynlz1wq07icf951wq5xdbr0whj1gz";
"11" = "1lkszqn4islsza011iabayv6riym0dwnkv83pkmk06b230qjfhzb";
}.${javaVersion};
url = "https://github.com/graalvm/graalpython/releases/download/vm-${version}/python-installable-svm-java${javaVersion}-linux-amd64-${version}.jar";
})
(fetchurl {
sha256 = { "8" = "1br7camk7y8ych43ws57096100f9kzjvqznh2flmws78ipcrrb66";
"11" = "10swxspjvzh0j82lbpy38dckk69lw1pawqkhnj1hxd05ls36fwq5";
}.${javaVersion};
url = "https://github.com/graalvm/graalvm-ce-builds/releases/download/vm-${version}/wasm-installable-svm-java${javaVersion}-linux-amd64-${version}.jar";
})
];
nativeBuildInputs = [ unzip perl ];
unpackPhase = ''
unpack_jar() {
jar=$1
unzip -o $jar -d $out
perl -ne 'use File::Path qw(make_path);
use File::Basename qw(dirname);
if (/^(.+) = (.+)$/) {
make_path dirname("$ENV{out}/$1");
system "ln -s $2 $ENV{out}/$1";
}' $out/META-INF/symlinks
perl -ne 'if (/^(.+) = ([r-])([w-])([x-])([r-])([w-])([x-])([r-])([w-])([x-])$/) {
my $mode = ($2 eq 'r' ? 0400 : 0) + ($3 eq 'w' ? 0200 : 0) + ($4 eq 'x' ? 0100 : 0) +
($5 eq 'r' ? 0040 : 0) + ($6 eq 'w' ? 0020 : 0) + ($7 eq 'x' ? 0010 : 0) +
($8 eq 'r' ? 0004 : 0) + ($9 eq 'w' ? 0002 : 0) + ($10 eq 'x' ? 0001 : 0);
chmod $mode, "$ENV{out}/$1";
}' $out/META-INF/permissions
rm -rf $out/META-INF
}
mkdir -p $out
arr=($srcs)
tar xf ''${arr[0]} -C $out --strip-components=1
unpack_jar ''${arr[1]}
unpack_jar ''${arr[2]}
unpack_jar ''${arr[3]}
unpack_jar ''${arr[4]}
'';
installPhase = {
"8" = ''
# BUG workaround http://mail.openjdk.java.net/pipermail/graal-dev/2017-December/005141.html
substituteInPlace $out/jre/lib/security/java.security \
--replace file:/dev/random file:/dev/./urandom \
--replace NativePRNGBlocking SHA1PRNG
# provide libraries needed for static compilation
for f in ${glibc}/lib/* ${glibc.static}/lib/* ${zlib.static}/lib/*; do
ln -s $f $out/jre/lib/svm/clibraries/linux-amd64/$(basename $f)
done
# allow using external truffle-api.jar and languages not included in the distribution
rm $out/jre/lib/jvmci/parentClassLoader.classpath
'';
"11" = ''
# BUG workaround http://mail.openjdk.java.net/pipermail/graal-dev/2017-December/005141.html
substituteInPlace $out/conf/security/java.security \
--replace file:/dev/random file:/dev/./urandom \
--replace NativePRNGBlocking SHA1PRNG
# provide libraries needed for static compilation
for f in ${glibc}/lib/* ${glibc.static}/lib/* ${zlib.static}/lib/*; do
ln -s $f $out/lib/svm/clibraries/linux-amd64/$(basename $f)
done
'';
}.${javaVersion};
dontStrip = true;
# copy-paste openjdk's preFixup
preFixup = ''
# Set JAVA_HOME automatically.
mkdir -p $out/nix-support
cat <<EOF > $out/nix-support/setup-hook
if [ -z "\''${JAVA_HOME-}" ]; then export JAVA_HOME=$out; fi
EOF
'';
postFixup = ''
rpath="${ { "8" = "$out/jre/lib/amd64/jli:$out/jre/lib/amd64/server:$out/jre/lib/amd64";
"11" = "$out/lib/jli:$out/lib/server:$out/lib";
}.${javaVersion}
}:${
stdenv.lib.makeLibraryPath [
stdenv.cc.cc.lib # libstdc++.so.6
zlib # libz.so.1
]}"
for f in $(find $out -type f -perm -0100); do
patchelf --interpreter "$(cat $NIX_CC/nix-support/dynamic-linker)" "$f" || true
patchelf --set-rpath "$rpath" "$f" || true
if ldd "$f" | fgrep 'not found'; then echo "in file $f"; fi
done
'';
propagatedBuildInputs = [ setJavaClassPath zlib ]; # $out/bin/native-image needs zlib to build native executables
doInstallCheck = true;
installCheckPhase = ''
echo ${stdenv.lib.escapeShellArg ''
public class HelloWorld {
public static void main(String[] args) {
System.out.println("Hello World");
}
}
''} > HelloWorld.java
$out/bin/javac HelloWorld.java
# run on JVM with Graal Compiler
$out/bin/java -XX:+UnlockExperimentalVMOptions -XX:+EnableJVMCI -XX:+UseJVMCICompiler HelloWorld | fgrep 'Hello World'
# Ahead-Of-Time compilation
$out/bin/native-image --no-server HelloWorld
./helloworld | fgrep 'Hello World'
# Ahead-Of-Time compilation with --static
$out/bin/native-image --no-server --static HelloWorld
./helloworld | fgrep 'Hello World'
'';
passthru.home = graalvmXXX-ce;
meta = with stdenv.lib; {
homepage = "https://www.graalvm.org/";
description = "High-Performance Polyglot VM";
license = with licenses; [ upl gpl2Classpath bsd3 ];
maintainers = with maintainers; [ bandresen volth hlolli glittershark ];
platforms = [ "x86_64-linux" ];
};
};
in
graalvmXXX-ce;
in {
graalvm8-ce = common "8";
graalvm11-ce = common "11";
}
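For reference, a short usage sketch of the resulting `graalvm11-ce` attribute from a development shell, mirroring the smoke test in the `installCheckPhase` above. The `shell.nix` wrapper itself is only an illustration; it assumes `graalvm11-ce` is reachable from `<nixpkgs>`:

```nix
# shell.nix — usage sketch; assumes graalvm11-ce is exposed at the top level of <nixpkgs>.
{ pkgs ? import <nixpkgs> { } }:

pkgs.mkShell {
  buildInputs = [ pkgs.graalvm11-ce ];
  # Inside the shell, the same commands as the installCheckPhase apply:
  #   javac HelloWorld.java
  #   native-image --no-server HelloWorld && ./helloworld
}
```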


@ -1,42 +1,43 @@
{ llvmPackages, lib, fetchFromGitHub, cmake
, libpng, libjpeg, mesa, eigen
, openblas, blas, lapack
{ llvmPackages
, lib
, fetchFromGitHub
, cmake
, libpng
, libjpeg
, mesa
, eigen
, openblas
, blas
, lapack
}:
assert blas.implementation == "openblas" && lapack.implementation == "openblas";
let
version = "2019_08_27";
in llvmPackages.stdenv.mkDerivation {
name = "halide-${builtins.replaceStrings ["_"] ["."] version}";
llvmPackages.stdenv.mkDerivation rec {
pname = "halide";
version = "10.0.0";
src = fetchFromGitHub {
owner = "halide";
repo = "Halide";
rev = "release_${version}";
sha256 = "09xf8v9zyxx2fn6s1yzjkyzcf9zyzrg3x5vivgd2ljzbfhm8wh7n";
rev = "v${version}";
sha256 = "0il71rppjp76m7zd420siidvhs76sqiq26h60ywk812sj9mmgxj6";
};
patches = [ ./nix.patch ];
# clang fails to compile intermediate code because
# of unused "--gcc-toolchain" option
postPatch = ''
sed -i "s/-Werror//" src/CMakeLists.txt
'';
cmakeFlags = [ "-DWARNINGS_AS_ERRORS=OFF" ];
cmakeFlags = [ "-DWARNINGS_AS_ERRORS=OFF" "-DWITH_PYTHON_BINDINGS=OFF" ];
# To handle the lack of 'local' RPATH; required, as they call one of
# their built binaries requiring their libs, in the build process.
preBuild = ''
export LD_LIBRARY_PATH="$(pwd)/lib''${LD_LIBRARY_PATH:+:}$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH="$(pwd)/src''${LD_LIBRARY_PATH:+:}$LD_LIBRARY_PATH"
'';
enableParallelBuilding = true;
# Note: only openblas and not atlas part of this Nix expression
# see pkgs/development/libraries/science/math/liblapack/3.5.0.nix
# to get a hint howto setup atlas instead of openblas
@ -44,24 +45,11 @@ in llvmPackages.stdenv.mkDerivation {
nativeBuildInputs = [ cmake ];
# No install target for cmake available.
# Calling install target in Makefile causes complete rebuild
# and the library rpath is broken, because libncursesw.so.6 is missing.
# Another way is using "make halide_archive", but the tarball is not easy
# to disassemble.
installPhase = ''
find
mkdir -p "$out/lib" "$out/bin"
cp bin/HalideTrace* "$out/bin"
cp lib/libHalide.so "$out/lib"
cp -r include "$out"
'';
meta = with lib; {
description = "C++ based language for image processing and computational photography";
homepage = "https://halide-lang.org";
license = licenses.mit;
platforms = [ "i686-linux" "x86_64-linux" ];
platforms = [ "i686-linux" "x86_64-linux" "aarch64-linux" ];
maintainers = [ maintainers.ck3d ];
};
}


@ -1,56 +0,0 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 4ba384324..7e23038f7 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -75,10 +75,10 @@ set(CMAKE_RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/bin")
set(LLVM_VERSION "${LLVM_VERSION_MAJOR}${LLVM_VERSION_MINOR}")
-file(TO_NATIVE_PATH "${LLVM_TOOLS_BINARY_DIR}/llvm-as${CMAKE_EXECUTABLE_SUFFIX}" LLVM_AS)
-file(TO_NATIVE_PATH "${LLVM_TOOLS_BINARY_DIR}/llvm-nm${CMAKE_EXECUTABLE_SUFFIX}" LLVM_NM)
-file(TO_NATIVE_PATH "${LLVM_TOOLS_BINARY_DIR}/clang${CMAKE_EXECUTABLE_SUFFIX}" CLANG)
-file(TO_NATIVE_PATH "${LLVM_TOOLS_BINARY_DIR}/llvm-config${CMAKE_EXECUTABLE_SUFFIX}" LLVM_CONFIG)
+find_program(LLVM_AS llvm-as HINTS ${LLVM_TOOLS_BINARY_DIR})
+find_program(LLVM_NM llvm-nm HINTS ${LLVM_TOOLS_BINARY_DIR})
+find_program(CLANG clang HINTS ${LLVM_TOOLS_BINARY_DIR})
+find_program(LLVM_CONFIG llvm-config HINTS ${LLVM_TOOLS_BINARY_DIR})
# LLVM doesn't appear to expose --system-libs via its CMake interface,
# so we must shell out to llvm-config to find this info
diff --git a/apps/linear_algebra/CMakeLists.txt b/apps/linear_algebra/CMakeLists.txt
index 132c80e6a..36ce865f2 100644
--- a/apps/linear_algebra/CMakeLists.txt
+++ b/apps/linear_algebra/CMakeLists.txt
@@ -26,7 +26,7 @@ if (CBLAS_FOUND)
# Atlas requires also linking against its provided libcblas for cblas symbols
set(ATLAS_EXTRA_LIBS cblas) # XXX fragile
set(OpenBLAS_EXTRA_LIBS)
- set(BLAS_VENDORS OpenBLAS ATLAS)
+ set(BLAS_VENDORS OpenBLAS)
# TODO
# there are more vendors we could add here that support the cblas interface
@@ -41,6 +41,7 @@ if (CBLAS_FOUND)
message(STATUS " ${BLAS_VENDOR}: Missing")
else()
message(STATUS " ${BLAS_VENDOR}: Found")
+ set(BLAS_LIBRARIES "${BLAS_LIBRARIES}" CACHE FILEPATH "BLAS library to use")
list(APPEND BLAS_VENDORS ${NAME})
endif()
endforeach()
diff --git a/apps/linear_algebra/tests/CMakeLists.txt b/apps/linear_algebra/tests/CMakeLists.txt
index cc02eb0a4..c20419a0d 100644
--- a/apps/linear_algebra/tests/CMakeLists.txt
+++ b/apps/linear_algebra/tests/CMakeLists.txt
@@ -19,7 +19,7 @@ target_compile_options(test_halide_blas PRIVATE -Wno-unused-variable)
target_link_libraries(test_halide_blas
PRIVATE
halide_blas
- cblas # XXX fragile
+ ${BLAS_LIBRARIES}
${HALIDE_COMPILER_LIB}
)
--
2.23.0


@ -2,13 +2,13 @@
stdenv.mkDerivation rec {
pname = "owl-lisp";
version = "0.1.19";
version = "0.1.23";
src = fetchFromGitLab {
owner = "owl-lisp";
repo = "owl";
rev = "v${version}";
sha256 = "1bgjd2gkr5risfcc401rlr5fc82gwm4r2gpp9gzkg9h64acivkjx";
sha256 = "1indcbicqcdlk9sinkdyhk50fi1b4cb7yrr14vr874gjzmwr2l3i";
};
nativeBuildInputs = [ which ];


@ -29,7 +29,7 @@ let
, embedSupport ? false
, ipv6Support ? true
, systemdSupport ? stdenv.isLinux
, valgrindSupport ? true
, valgrindSupport ? !stdenv.isDarwin
, ztsSupport ? apxs2Support
}@args:
let


@ -46,7 +46,7 @@ in
stdenv.mkDerivation rec {
pname = "racket";
version = "7.8"; # always change at once with ./minimal.nix
version = "7.9"; # always change at once with ./minimal.nix
src = (stdenv.lib.makeOverridable ({ name, sha256 }:
fetchurl {
@ -55,7 +55,7 @@ stdenv.mkDerivation rec {
}
)) {
name = "${pname}-${version}";
sha256 = "19z3dayybcra277s4gk2mppalwawd93f2b16xyrb6d7rbbfz7j9j";
sha256 = "18pz6gjzqy6a62xkcmjanhr7kgxpvpmc0blrk4igz8ldcybz44if";
};
FONTCONFIG_FILE = fontsConf;
@ -72,7 +72,7 @@ stdenv.mkDerivation rec {
preConfigure = ''
unset AR
for f in src/lt/configure src/cs/c/configure src/racket/src/string.c; do
for f in src/lt/configure src/cs/c/configure src/bc/src/string.c; do
substituteInPlace "$f" --replace /usr/bin/uname ${coreutils}/bin/uname
done
mkdir src/build


@ -5,7 +5,7 @@ racket.overrideAttrs (oldAttrs: rec {
name = "racket-minimal-${oldAttrs.version}";
src = oldAttrs.src.override {
inherit name;
sha256 = "0bbglf9vfacpm2hn3lskhvc8cpg6z088fbnzpqsn17z8qdk8yvb3";
sha256 = "0xvnd7afx058sg7j51bmbikqgn4sl0246nkhr8zlqcrbr3nqi6p4";
};
meta = oldAttrs.meta // {


@ -1,11 +1,11 @@
{ stdenv, fetchurl }:
stdenv.mkDerivation rec {
name = "lombok-1.18.12";
name = "lombok-1.18.16";
src = fetchurl {
url = "https://projectlombok.org/downloads/${name}.jar";
sha256 = "01jl6i5wzjxyk36fcq6ji90x9h143gvnwhv86cbkqaxhxh41af29";
sha256 = "1msys7xkaj0d7fi112fmb2z50mk46db58agzrrdyimggsszwn1kj";
};
buildCommand = ''


@ -4,7 +4,7 @@
}:
let
version = "1.9.0";
version = "1.9.1";
in
mkDerivation {
@ -24,7 +24,7 @@ mkDerivation {
};
src = fetchurl {
url = "https://github.com/KDAB/KDSoap/releases/download/kdsoap-${version}/kdsoap-${version}.tar.gz";
sha256 = "0a28k48cmagqxhaayyrqnxsx1zbvw4f06dgs16kl33xhbinn5fg3";
sha256 = "09rxx7h98niawz7i94g279c2rgh7xmq1hqxwlyzwsaqsx4kfl850";
};
outputs = [ "out" "dev" ];
nativeBuildInputs = [ cmake ];


@ -11,11 +11,11 @@
stdenv.mkDerivation rec {
pname = "libfilezilla";
version = "0.24.1";
version = "0.25.0";
src = fetchurl {
url = "https://download.filezilla-project.org/${pname}/${pname}-${version}.tar.bz2";
sha256 = "sha256-/dW07hkWr3sdQC591GfwXfdiS7ZfuVoIdaA3EuzC1v0=";
sha256 = "0akvki7n5rwmc52wss25i3h4nwl935flhjypf8dx3lvf4jszxxiv";
};
nativeBuildInputs = [ autoreconfHook pkgconfig ];


@ -3,11 +3,11 @@ let
s = # Generated upstream information
rec {
baseName="libmwaw";
version="0.3.16";
version="0.3.17";
name="${baseName}-${version}";
hash="0s0qvrmxzs8wv4304p7zx9mrasglyaszafqrfmaxwyr9lpdrwqqc";
url="mirror://sourceforge/libmwaw/libmwaw/libmwaw-0.3.16/libmwaw-0.3.16.tar.xz";
sha256="0s0qvrmxzs8wv4304p7zx9mrasglyaszafqrfmaxwyr9lpdrwqqc";
hash="074ipcq9w7jbd5x316dzclddgia2ydw098ph9d7p3d713pmkf5cf";
url="mirror://sourceforge/libmwaw/libmwaw/libmwaw-0.3.17/libmwaw-0.3.17.tar.xz";
sha256="074ipcq9w7jbd5x316dzclddgia2ydw098ph9d7p3d713pmkf5cf";
};
nativeBuildInputs = [ pkgconfig ];


@ -3,13 +3,13 @@
stdenv.mkDerivation rec {
pname = "libqalculate";
version = "3.13.0";
version = "3.14.0";
src = fetchFromGitHub {
owner = "qalculate";
repo = "libqalculate";
rev = "v${version}";
sha256 = "0nd0hrnp0a9p7hy6l6s45kfd267r7qg91aqn8g2dyam5hngskayk";
sha256 = "1j4sr9s7152xmci677pnz64spv8s3ia26fbp5cqx8ydv7swlivh2";
};
outputs = [ "out" "dev" "doc" ];


@ -1,14 +1,14 @@
{ stdenv, fetchFromGitHub, cmake, ninja, zlib, expat, rpm, db }:
stdenv.mkDerivation rec {
version = "0.7.14";
version = "0.7.16";
pname = "libsolv";
src = fetchFromGitHub {
owner = "openSUSE";
repo = "libsolv";
rev = version;
sha256 = "10klbgknl2njbjl4k0l50ii7afwqrl1691ar4ry3snmc8chb1z7g";
sha256 = "1arw7p83s5yq36pw94b76zfiqsh90fjjnayni54a5wgm0zvbkki8";
};
cmakeFlags = [


@ -1,9 +1,9 @@
{stdenv, fetchurl}:
stdenv.mkDerivation rec {
version = "5.1.1";
version = "5.1.2";
src = fetchurl {
url = "mirror://gnu/osip/libosip2-${version}.tar.gz";
sha256 = "0kgnxgzf968kbl6rx3hjsfb3jsg4ydgrsf35gzj319i1f8qjifv1";
sha256 = "148j1i0zkwf09qdpk3nc5sssj1dvppw7p0n9rgrg8k56447l1h1b";
};
pname = "libosip2";


@ -75,6 +75,7 @@
, "eslint_d"
, "expo-cli"
, {"fast-cli": "1.x"}
, "fauna-shell"
, "fkill-cli"
, "forever"
, "fx"

File diff suppressed because it is too large.


@ -0,0 +1,28 @@
{ lib, fetchurl, buildDunePackage
, cstruct, logs, lwt, mirage-flow
, alcotest, mirage-flow-combinators
}:
buildDunePackage rec {
pname = "mirage-channel";
version = "4.0.1";
useDune2 = true;
src = fetchurl {
url = "https://github.com/mirage/mirage-channel/releases/download/v${version}/mirage-channel-v${version}.tbz";
sha256 = "0wmb2zhiyp8n78xgcspcsyd19bhcml3kyli2caw3778wc1gyvfpc";
};
propagatedBuildInputs = [ cstruct logs lwt mirage-flow ];
doCheck = true;
checkInputs = [ alcotest mirage-flow-combinators ];
meta = {
description = "Buffered channels for MirageOS FLOW types";
license = lib.licenses.isc;
maintainers = [ lib.maintainers.vbgl ];
homepage = "https://github.com/mirage/mirage-channel";
};
}


@ -4,11 +4,11 @@ buildDunePackage rec {
minimumOCamlVersion = "4.08";
pname = "mirage-crypto";
version = "0.8.6";
version = "0.8.7";
src = fetchurl {
url = "https://github.com/mirage/mirage-crypto/releases/download/v${version}/mirage-crypto-v${version}.tbz";
sha256 = "1fghg89lpm1iks6nk1jhqcimpvb52jih0ys9bxbn2f343l0njbkq";
sha256 = "1gx86h6kk39zq3kvl854jc2ap2755paalp1f7iv8r9js2xnbxfxy";
};
useDune2 = true;


@ -1,14 +1,14 @@
{ mkDerivation, fetchurl, pkgs, lib, php }:
let
pname = "composer";
version = "2.0.0";
version = "2.0.4";
in
mkDerivation {
inherit pname version;
src = fetchurl {
url = "https://getcomposer.org/download/${version}/composer.phar";
sha256 = "11fjplbrscnw0fs5hmw4bmszg5a87ig189175407i1ip5fm5g5hk";
sha256 = "03bnaifywh8pmpzl0b8r3rm3radj0rz176vzkz327j99fi3vrcn3";
};
dontUnpack = true;


@ -1,14 +1,14 @@
{ mkDerivation, fetchurl, pkgs, lib, php }:
let
pname = "phpstan";
version = "0.12.51";
version = "0.12.52";
in
mkDerivation {
inherit pname version;
src = pkgs.fetchurl {
url = "https://github.com/phpstan/phpstan/releases/download/${version}/phpstan.phar";
sha256 = "0pfy14c0r64hdzlq5x1w225za2566s8vhh4hnfasmfh52s7v77p4";
sha256 = "0zhbpcja7fyhqi2p8mky7v3dv50dgi4yxpj2hvmxs61kp9irf0nb";
};
phases = [ "installPhase" ];


@ -1,14 +1,14 @@
{ mkDerivation, fetchurl, pkgs, lib, php }:
let
pname = "psalm";
version = "3.11.2";
version = "4.1.0";
in
mkDerivation {
inherit pname version;
src = fetchurl {
url = "https://github.com/vimeo/psalm/releases/download/${version}/psalm.phar";
sha256 = "1ani0907whqy2ycr01sjlvrmwps4dg5igim8z1qyv8grhwvw6gb0";
sha256 = "1mpbw9q0fgh6vdfwsm222fz1vi9jrw6l5k1mz4gyv5kxvbyzmn4c";
};
phases = [ "installPhase" ];


@ -8,14 +8,14 @@
}:
buildPythonPackage rec {
version = "0.15.1";
version = "0.15.2";
pname = "authlib";
src = fetchFromGitHub {
owner = "lepture";
repo = "authlib";
rev = "v${version}";
sha256 = "0jh4kdi5spzhmgvq3ffz2q467hjycz3wg97f7n53rffiwd86jrh5";
sha256 = "0jsqh2nirx3xifsakqdpd3wpdig6czavv3yj4lyqz3wh9xjpvswg";
};
propagatedBuildInputs = [ cryptography requests ];


@ -5,12 +5,12 @@
buildPythonPackage rec {
pname = "azure-appconfiguration";
version = "1.1.0";
version = "1.1.1";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "0mv053vl88nzpv701gnjdmbylc8qm0kkq87264rfhvrx3ydymf97";
sha256 = "b83cd2cb63d93225de84e27abbfc059212f8de27766f4c58dd3abb839dff0be4";
};
propagatedBuildInputs = [


@ -14,14 +14,14 @@
}:
buildPythonPackage rec {
version = "1.8.1";
version = "1.8.2";
pname = "azure-core";
disabled = isPy27;
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "7efbeac3a6dfb634cb5323bc04e18ab609aeab6b03610808091aa0517373d626";
sha256 = "621b53271f7988b766f8a7d7f7a2c44241e3d2c1d8db13e68089d6da6241748e";
};
propagatedBuildInputs = [


@ -11,11 +11,11 @@
buildPythonPackage rec {
pname = "azure-datalake-store";
version = "0.0.50";
version = "0.0.51";
src = fetchPypi {
inherit pname version;
sha256 = "9b9b58dcf1d0d0e5aa499d5cb49dcf8f5432ca467a747b39167bb70ef901dbc2";
sha256 = "b871ebb3bcfd292e8a062dbbaacbc132793d98f1b60f549a8c3b672619603fc1";
};
propagatedBuildInputs = [


@ -17,13 +17,13 @@
buildPythonPackage rec {
pname = "azure-identity";
version = "1.4.0";
version = "1.4.1";
disabled = isPy38;
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "820e1f3e21f90d36063239c6cb7ca9a6bb644cb120a6b1ead3081cafdf6ceaf8";
sha256 = "7b071089faf0789059ac24052e311e2b096a002c173d42b96896db09c6e2ba5d";
};
propagatedBuildInputs = [


@ -10,11 +10,11 @@
buildPythonPackage rec {
pname = "azure-mgmt-billing";
version = "0.2.0"; #pypi's 0.2.0 doesn't build ootb
version = "1.0.0"; #pypi's 0.2.0 doesn't build ootb
src = fetchPypi {
inherit pname version;
sha256 = "1li2bcdwdapwwx7xbvgfsq51f2mrwm0qyzih8cjhszcah2rkpxw5";
sha256 = "8b55064546c8e94839d9f8c98e9ea4b021004b3804e192bf39fa65b603536ad0";
extension = "zip";
};
@ -25,7 +25,7 @@ buildPythonPackage rec {
];
preBuild = ''
rm azure_bdist_wheel.py
rm -rf azure_bdist_wheel.py
substituteInPlace setup.cfg \
--replace "azure-namespace-package = azure-mgmt-nspkg" ""
'';


@ -8,12 +8,12 @@
buildPythonPackage rec {
pname = "azure-mgmt-cognitiveservices";
version = "6.2.0";
version = "6.3.0";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "93503507ba87c18fe24cd3dfcd54e6e69a4daf7636f38b7537e09cee9a4c13ce";
sha256 = "1d029d5140152a36cc32f340e09f2b185ede2f54e577a44f3821878efb823415";
};
propagatedBuildInputs = [


@ -10,12 +10,12 @@
buildPythonPackage rec {
pname = "azure-mgmt-containerservice";
version = "9.4.0";
version = "10.0.0";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "d90684106c70779450b82067be4d3e449c799ca1f47d941e45f6d2b5c016dac9";
sha256 = "9b44b2d0b281fc1999324a715fb5cf4f47d392a35bc0a01f24bb8dbc4c123acd";
};
propagatedBuildInputs = [


@ -7,13 +7,13 @@
}:
buildPythonPackage rec {
version = "1.2.0";
version = "1.2.1";
pname = "azure-mgmt-core";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "8fe3b59446438f27e34f7b24ea692a982034d9e734617ca1320eedeee1939998";
sha256 = "a3906fa77edfedfcc3229dc3b69489d5ed63b107c7eacbc50092e6cbfbfd83f0";
};
propagatedBuildInputs = [


@ -10,12 +10,12 @@
buildPythonPackage rec {
pname = "azure-mgmt-datafactory";
version = "0.13.0";
version = "0.14.0";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "3eabe34f49587840617747511a0aa25d08b0c1619677a1fdda24ce9e6b0f2c74";
sha256 = "47bb23fa6cc28f3f15dd7a404b7f9d7af469adf78f79a11eb01ff75ca10a75ba";
};
propagatedBuildInputs = [


@ -5,13 +5,13 @@
}:
buildPythonPackage rec {
version = "1.7.0";
version = "2.0.0";
pname = "azure-mgmt-hdinsight";
disabled = isPy27;
src = fetchPypi {
inherit pname version;
sha256 = "9d1120bd9760687d87594ec5ce9257b7335504afbe55b3cda79462c1e07a095b";
sha256 = "fd47029f2423e45ec4d311f651dc972043b98e960f186f5c6508c6fdf6eb2fe8";
extension = "zip";
};

Some files were not shown because too many files have changed in this diff.