qrdn

quite random domain name

Jul 29, 2023

The Nightmare Stacks map

I very much enjoyed "The Nightmare Stacks" by Charles Stross, my favourite book of the "Laundry Files" series (as released so far).

Since the plot is set in a real place, I used the excellent umap website to draw the places & campaign movements named in the book as an OpenStreetMap overlay (actually 3 layers: first the places & ley lines from the preparation phase, second the campaign of the Host of Air and Darkness, third the counter-movements by the Urük):

[embedded umap map with a "See full screen" link]

Jan 06, 2023

checkmk setup details (inside podman container)

Things about CheckMK not (easily) found in the documentation.

Every host is pinged; this service is named "Check_MK".

Run the containerized version in podman with --cap-add net_raw so that ping works.
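A minimal sketch of such a podman run call (image tag, port mapping and volume name are illustrative, not from the post; see the checkmk docs for the full set of recommended options):

podman run -d --name checkmk \
    --cap-add=net_raw \
    -p 8080:5000 \
    -v checkmk-sites:/omd/sites \
    docker.io/checkmk/check-mk-raw:latest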

Install Plugins in containerized Raw edition:

$ podman exec -it -u cmk  checkmk  /bin/bash
OMD[cmk]:~$ cd /tmp
OMD[cmk]:~$ curl -O https://raw.githubusercontent.com/f-zappa/check_mk_adsl_line/master/adsl_line-1.3.mkp
OMD[cmk]:~$ mkp install adsl_line-1.3.mkp

Monitoring RHEL7: it ships systemd v219, but the agent requires v220. Remedy: "legacy mode". To avoid xinetd, use the agent over SSH. To get the host key into the container's ssh known_hosts, run cmk --check $hostname as the site user, which lets "the correct" ssh ask for accepting the host key; podman exec -u cmk $containername /bin/ssh $hostname did not suffice in my case.
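A sketch of what that looks like, mirroring the plugin installation above (container name and hostname are placeholders):

$ podman exec -it -u cmk checkmk /bin/bash
OMD[cmk]:~$ cmk --check rhel7.example.org    # the site user's ssh now prompts to accept the host key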

IPv6

checkmk defaults to IPv4 only; change this in Settings > Folder "Main" > Dual stack. Selecting "both" will do ping4 and ping6 checks separately.

The podman container gets no public IPv6 route, only a link-local address (fe80::), thus IPv6 pings to public addresses fail. Apparently an open issue: https://github.com/containers/podman/issues/15850.

Dec 27, 2022

Reading BTLE advertisements on Linux

Project context: reading environmental sensor data from card10 into influxDB.

GATT / ESS

epicardium (the C firmware) exports sensor data over Bluetooth GATT ESS (environmental sensing service).

GATT requires pairing, connecting → only 1:1, encrypted

Reading possible with card10_iaq_notify.py from https://git.card10.badge.events.ccc.de/card10/firmware/-/merge_requests/508/diffs

Turns out the eCO2 values computed by the BSEC library are quite bogus compared to a proper NDIR CO2 sensor: https://dm1cr.de/co2-sensor-vergleich. I replicated these results.

Correlate BME680 with MH-Z19c

Now the idea is to check if the raw resistance values from the BME680 have some better correlation to actual CO2 values, without the intelligence from the proprietary BSEC library.

Epicardium has these values, but doesn't export them over Bluetooth ESS. So instead, use a Python script to send BT-LE advertisements ourselves: https://codeberg.org/scy/sensible-card10/ and extend it to include the raw resistance value.

reading GAP / BLE advertisements on linux

This is a real problem due to the lack of documentation of bluez5. Most online searches turn up the deprecated hcitool and friends. There is a Python library bluepy which implements BLE advertisement scanning, but its docs are limited too: https://ianharvey.github.io/bluepy-doc/. A blog post mentions the backend binary bluepy-helper which can also be talked to directly: https://macchina.io/blog/internet-of-things/communication-with-low-energy-bluetooth-devices-on-linux/
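bluepy may not be packaged by your distribution; a typical way to get it on a Debian-ish system (an assumption about the setup, not part of these notes):

sudo apt install libglib2.0-dev   # bluepy-helper is compiled against glib when the package is installed
pip install --user bluepy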

GAP ~= beacons, limited data rate, 1:n, unencrypted. See https://elinux.org/images/3/32/Doing_Bluetooth_Low_Energy_on_Linux.pdf, https://learn.adafruit.com/introduction-to-bluetooth-low-energy?view=all

Check that the source is actually sending advertisements (on the card10 serial console, sudo picocom -b 115200 /dev/ttyACM0, there was a Python stacktrace in my case).

Bluetooth snooping using bluez' btmon: no filtering, so e.g. A2DP traffic clutters the output.

Better: the reference tool blescan from bluepy:

$ sudo blescan -a -t0
    Device (new): ca:4d:10:XX:XX:XX (public), -70 dBm
        16b Service Data: <1a18c2cccc071b16187e0f00b100016c00860260>
        Complete Local Name: 'card10'

Copy its implementation to implement your own data parsing:

import struct

from bluepy import btle

class ScanPrint(btle.DefaultDelegate):
    def handleDiscovery(self, dev: btle.ScanEntry, isNewDev: bool, isNewData: bool):
        magic, version, temp, hum, press, gr, iaqa, iaq, eco2, battery = \
            struct.unpack('<HHhHLhBHHB', dev.getValue(btle.ScanEntry.SERVICE_DATA_16B))
        # ScanEntry.getValueText() parses bytes into hex-digits, but struct.unpack() needs bytes from getValue

scanner = btle.Scanner().withDelegate(ScanPrint())
scanner.scan(10.0)  # run as root (like the sudo blescan call above); discoveries go to handleDiscovery()

Dec 15, 2022

Merge multiple PDF pages into one page

Example: scanning a credit card results in a PDF with two pages, each containing the image of one side of the card.

Join the images, put them onto one page (and add some 10px distance between them):

pdfjam input.pdf --nup 1x2 --noautoscale=true --fitpaper=true --delta "10 10" --outfile out.pdf
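For the record, the same call with the nup geometry flipped should place the two pages side by side instead of stacked (untested sketch; --nup is columns x rows):

pdfjam input.pdf --nup 2x1 --noautoscale=true --fitpaper=true --delta "10 10" --outfile out.pdf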

Aug 25, 2022

plasmashell stuck at 100% CPU

My Fedora 35 had its plasmashell process stuck at 100% CPU after some switching between Wayland and X11 sessions. When trying to change global keyboard shortcuts (e.g. Alt-F1 for the app menu launcher), it would freeze completely for around 30 seconds, repeatedly. A fresh user account didn't have that problem.

The fix was to remove ~/.config/plasmashellrc.
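A sketch of the same fix that keeps a backup and restarts the shell without logging out (the restart command may differ between Plasma versions):

mv ~/.config/plasmashellrc ~/.config/plasmashellrc.bak
plasmashell --replace &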

Feb 18, 2018

HTC HD2 startup sequence // how to update the bootloader

Source: https://forum.xda-developers.com/showthread.php?t=1402975

Startup sequence

  1. HSPL ("tri-colored screen")
  2. Bootloader: MAGLDR, cLK
  3. one of
    • recovery: CWM, TWRP, ...
    • OS (Android, Windows, ...)

Magic Buttons at Startup:

  1. HSPL: keep Volume down and press End call/Power once
  2. MAGLDR: keep End call/Power

Update Bootloader (with broken Start call button):

  • place leoimg.nbh on external SD card (FAT32 partition)
  • start into HSPL
  • confirm update by pressing End call/Power

USB commands:

Source: https://forum.xda-developers.com/showthread.php?t=1292146

  • fastboot flash recovery $recovery.img
  • fastboot oem boot-recovery - boot into recovery
  • fastboot oem \? - show help, which also tells how to emulate hardware keys

Sep 19, 2016

How to save a whole mediawiki into a git repo

Recently I tried to archive the contents of our old MediaWiki instance to a git repository. Somebody else had already done that, using some scripts from MediaWiki, but these ignored the page histories, saving only the most recent version of each page, and offered no possibility to also save uploaded files, especially images.

So I decided to see if I could do better, and found Git-Mediawiki. I had to fiddle a bit, because Arch Linux's git package only ships it as copied files instead of installing it, and because of our broken TLS certificate, but eventually got the import to work:

pacman -Sy perl-mediawiki-api perl-datetime-format-iso8601 perl-lwp-protocol-https
sudo ln -s /usr/share/git/mw-to-git/git-mw.perl /usr/lib/git-core/git-mw
sudo ln -s /usr/share/git/mw-to-git/git-remote-mediawiki.perl /usr/lib/git-core/git-remote-mediawiki
export PERL5LIB=/usr/share/git/mw-to-git/
export PERL_LWP_SSL_VERIFY_HOSTNAME=0  # this makes the whole TLS encryption insecure -- I use it because we don't have a valid certificate, and I don't intend to write back to the wiki
git clone mediawiki::https://wiki.chaos-darmstadt.de/w

The result is a linear history with one commit for each saved revision of any page. There seem to be some bugs, though:

  • subpages are not exported, like our main pages' subsections "Hauptseite/Header" etc.
  • some page histories occur twice in the git history, e.g. for page "Mate-Basteln"
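Since saving uploaded files was part of the motivation: according to the Git-Mediawiki documentation there is a media import option that can be set during the clone. I did not verify this against our wiki, so treat it as a hedged pointer rather than a tested recipe:

git clone -c remote.origin.mediaimport=true mediawiki::https://wiki.chaos-darmstadt.de/w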


Some of the things I did wrong when getting it to work:

Errors

  1. wrong endpoint https://wiki.chaos-darmstadt.de/

    fatal: could not get the list of wiki pages.
    fatal: 'https://wiki.chaos-darmstadt.de/' does not appear to be a mediawiki
    fatal: make sure 'https://wiki.chaos-darmstadt.de//api.php' is a valid page
    fatal: and the SSL certificate is correct.
    fatal: (error 2: 404 Not Found : error occurred when accessing https://wiki.chaos-darmstadt.de//api.php after 1 attempt(s))
    fatal: Could not read ref refs/mediawiki/origin/master
    
  2. Wrong endpoint https://wiki.chaos-darmstadt.de/wiki/

    Searching revisions...
    No previous mediawiki revision found, fetching from beginning.
    Fetching & writing export data by pages...
    Listing pages on remote wiki...
    fatal: could not get the list of wiki pages.
    fatal: 'https://wiki.chaos-darmstadt.de/wiki/' does not appear to be a mediawiki
    fatal: make sure 'https://wiki.chaos-darmstadt.de/wiki//api.php' is a valid page
    fatal: and the SSL certificate is correct.
    fatal: (error 2: Failed to decode JSON returned by https://wiki.chaos-darmstadt.de/wiki//api.php
    Decoding Error:
    malformed JSON string, neither tag, array, object, number, string or atom, at character offset 0 (before "<!DOCTYPE html>\n<ht...") at /usr/share/perl5/vendor_perl/MediaWiki/API.pm line 400.
    
    Returned Data:
    <!DOCTYPE html>
    <html lang="de" dir="ltr" class="client-nojs">
    <head>
    <meta charset="UTF-8" />
    <title>Diese Aktion gibt es nicht – Chaos-Darmstadt Wiki</title>
    
    ... (all the HTML from the page) ...
    
    fatal: Could not read ref refs/mediawiki/origin/master
    
  3. PERL5LIB not set

    Klone nach 'wiki' ...
    Can't locate Git/Mediawiki.pm in @INC (you may need to install the Git::Mediawiki module) (@INC contains: /usr/lib/perl5/site_perl /usr/share/perl5/site_perl /usr/lib/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib/perl5/core_perl /usr/share/perl5/core_perl .) at /usr/lib/git-core/git-remote-mediawiki line 18.
    BEGIN failed--compilation aborted at /usr/lib/git-core/git-remote-mediawiki line 18.
    
  4. SSL/TLS cert not accepted. I work around this by disabling the check, because I know the cert is broken, and I don't intend to write back to the wiki, so in the worst case my export attempt is tampered with. In general, always correctly check your certificates and treat this as a severe error!

    Searching revisions...
    No previous mediawiki revision found, fetching from beginning.
    Fetching & writing export data by pages...
    Listing pages on remote wiki...
    fatal: could not get the list of wiki pages.
    fatal: 'https://wiki.chaos-darmstadt.de/wiki/' does not appear to be a mediawiki
    fatal: make sure 'https://wiki.chaos-darmstadt.de/wiki//api.php' is a valid page
    fatal: and the SSL certificate is correct.
    fatal: (error 2: 500 Can't connect to wiki.chaos-darmstadt.de:443 (certificate verify failed) : error occurred when accessing https://wiki.chaos-darmstadt.de/wiki//api.php after 1 attempt(s))
    fatal: Could not read ref refs/mediawiki/origin/master
    
  5. git remote helper mediawiki not installed (ln -s commands from above):

    fatal: Unable to find remote helper for 'mediawiki'
    

Sep 14, 2016

Qt "ModelTest" application

Several sources on the Internet propose the ModelTest application to check whether custom QAbstractItemModel subclasses behave correctly. The application is contained in the official sources (Qt5 and Qt4), and documented (in a sense) in the official wiki. What is not clear, however:

  • The mentioned .pri file is missing; one version can be found on GitHub
  • Although not stated clearly, ModelTest is not something you can drop into your otherwise normal application, comparable to Q_ASSERT statements. Instead, it's a unit test for the QTest framework, i.e. a standalone test of a separate model instance. If your model proxies a complex data structure which cannot easily be recreated or mocked for the test, ModelTest is useless.
  • Various versions of ModelTest are in the Qt repos, at KDE, and elsewhere. The ones in the Qt repositories have not received many updates recently (last commit from 2016-01-21), so it's not clear whether they are maintained at all.

Feb 13, 2015

debugging ANTLR4 Lexer

grun MyLexer tokens -tokens < testfile

invokes the TestRig on the lexer, spilling out the tokens it recognized. Example stdout:

[@0,0:9='google.com',<4>,1:0]
[@1,10:10='\n',<2>,1:10]
[@2,11:11='\t',<1>,2:0]

Format of this output: A list of tokens, where each is:

[@tokenIndex,startIndex:endIndex='spelling',<tokenId>,lineNo:columnNo]

or (if not on the default channel)

[@tokenIndex,startIndex:endIndex='spelling',<tokenId>,channel=channelId,lineNo:columnNo]

  • tokenIndex - index in the whole output, starting at 0
  • startIndex,endIndex - char/byte? in the input stream
  • spelling - the literal text
  • tokenId - can be found in the .tokens file
  • channelId - index of the channel(?)
  • lineNo,columnNo - line, column of the token start

Tip: append | column -t -s, | less to create a table delimited at , and increase readability (and pass through less for paging).

This does not output "sub-tokens", i.e. it shows only the highest-level tokens, not the ones they are assembled from.
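For context, a sketch of the typical setup behind such a grun call ("antlr4" and "grun" are the customary shell aliases for the ANTLR jar and org.antlr.v4.gui.TestRig; grammar and file names are placeholders):

antlr4 MyLexer.g4          # generates MyLexer.java and MyLexer.tokens
javac MyLexer*.java        # the compiled classes must be on the CLASSPATH
grun MyLexer tokens -tokens < testfile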

Nov 02, 2014

grep -f

grep -f patterns.txt input.txt is horribly slow. Even a plain loop like for l in $(cat patterns.txt) ; do grep "$l" input.txt ; done is orders of magnitude faster.

Aug 05, 2014

TIL - htop graphs

Today I learned: htop can show the value history of the meters in the top bar. Go to configure them, then select any that has a [mode] in square brackets behind it; F4 toggles that mode, with a live preview, between plain number view, ASCII-graphed history, and "LED", which renders big ASCII-art numbers.

Jul 15, 2014

Lol, Unicode…

Some Unicode findings:

  • 📳 U+1F4F3 : VIBRATION MODE
  • 🚂 U+1F682 : STEAM LOCOMOTIVE
  • 📢 U+1F4E2 : PUBLIC ADDRESS LOUDSPEAKER
  • 🍝 U+1F35D : SPAGHETTI

May 01, 2013

Migrating a router from DD-WRT to OpenWRT

How to replace the DD-WRT firmware on your router with OpenWRT.


My router: TL-WR1043ND version (DE) v1.0

Old firmware: DD-WRT r19519

New firmware: OpenWRT "Attitude Adjustment" 12.09 Beta 2, filename: openwrt-ar71xx-generic-tl-wr1043nd-v1-squashfs-factory.bin


First revert back to the original TP-Link firmware [1], using the web flash interface and a special image [2]. This took very long in my case: first the 200 seconds counted down in the browser, then some minutes passed where nothing happened. Finally the browser showed a confirmation dialog reading "Update failed", but the router was unresponsive (no ping, no DHCP). After a power cycle it was up and running fine, with the original firmware reporting:

3.11.5 Build 100427 Rel.61427n
Hardware Version:   
WR1043N v1 00000000

Then I simply put the OpenWRT factory image from above into the web flash form and uploaded it. It took some time, then OpenWRT was up and the LuCI web interface asked me to set a root password.


[1]: got this instruction from http://wiki.openwrt.org/doc/howto/generic.flashing#via.original.firmware

[2]: from here http://www.dd-wrt.com/phpBB2/viewtopic.php?t=85237

Feb 17, 2013

quick'n'dirty PXE

… on Archlinux:

wget -O /var/ftpd/ipxe.pxe http://releng.archlinux.org/pxeboot/
pacman -S tftp-hpa
sudo in.tftpd -L

Append this to the dnsmasq config on your DHCP server:

dhcp-boot=/var/ftpd/ipxe.pxe,,<ip-of-tftp-server>

Feb 04, 2013

E135 backlight

Disclaimer: this article is rather old and does not reflect the current state of backlight handling on the Thinkpad E135 under Linux.

First, the backlight was fixed at maximum. No tool could change it; /sys/class/backlight/acpi_video0/brightness could be written to and its value changed, but the backlight didn't follow. I tried the solutions in [1], excluding those which would just echo to the above /sys/... file, until I found out that the kernel parameter acpi_backlight=vendor did the trick.

Now /sys/class/backlight no longer contained an acpi_video0 directory, but a thinkpad_screen directory, which has the same brightness files; these change when using the Fn keys for brightness.

If I write to them, their content changes but not the backlight, but at least the Fn key combos work.
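For reference, a sketch of making the kernel parameter permanent on a GRUB-based setup (file locations and the regeneration command vary by distribution):

# /etc/default/grub:  GRUB_CMDLINE_LINUX_DEFAULT="... acpi_backlight=vendor"
sudo grub-mkconfig -o /boot/grub/grub.cfg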

Feb 04, 2013

KMix with multiple identically named master channels

On the Thinkpad Edge E135, ALSA recognizes 2 sound cards (0 and 1), of which #1 is the analog one I want to use and control - but it is not the default one. alsamixer can control it by selecting the entry in its F6 menu, but it still isn't the default. Creating this asound.conf [1] fixes that.

Now KMix shows 2 open tabs (both with the label "HD Audio Generic"), one for each card. Unfortunately it defaults to controlling the digital output with its control-panel icon, too, and worse, the dialog to change that "Master Channel" gets confused by the identical names and just doesn't display any channel, so you can't change it.

Fix: Stop KMix (dunno if that's really necessary, but it probably is), open ~/.kde4/share/config/kmixrc and set the MasterMixer= and MasterMixerDevice= entries in the [Global] section. MasterMixer specifies an equivalent to the ALSA card, but apparently 1-indexed (so here it was "ALSA::HD-Audio_Generic:1" and I set it to "ALSA::HD-Audio_Generic:2"); MasterMixerDevice tells the channel which should be controlled as Master ("IEC958:0" here, changed to "Master:0"). Simply look at the end of the section names and compare with the channels displayed in KMix under each tab (i.e. card), if you're unsure which entries to set.
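For reference, the relevant kmixrc entries then look roughly like this (values as described above; yours will differ depending on card and channel names):

[Global]
MasterMixer=ALSA::HD-Audio_Generic:2
MasterMixerDevice=Master:0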

Note: In later versions of KMix (IIRC around 4.15) this stopped working, so I stayed on that version.


  1. /etc/asound.conf:

    defaults.pcm.card 1
    defaults.pcm.device 0
    defaults.ctl.card 1
    

Dec 01, 2012

32bit wine-prefixes

tl;dr: WINEARCH=win32 creates a 32-bit prefix; to confirm, check that $WINEPREFIX/drive_c/Program Files (x86) does not exist (that directory only appears in 64-bit prefixes)

A wine prefix is a directory containing a lot of the files wine needs to simulate a Windows installation. The "standard" wine prefix ~/.wine is used whenever you don't specify a different one in the environment variable $WINEPREFIX. Now, like a normal Windows installation, the environment (e.g. the provided libraries) may be of either 32- or 64-bitness. Wine copies these libraries into the prefix when it is created, so you have to specify the desired bitness at that point, using the $WINEARCH environment variable. To create a 32-bit prefix, just execute wine with WINEARCH=win32 (and the desired $WINEPREFIX) set, e.g.

WINEARCH=win32 WINEPREFIX=~/.wine_32bit winecfg

Finally, to check what bitness a given prefix has, just check if there is a directory drive_c/Program Files (x86) in the prefix - if there is, it's a 64-bit prefix.
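A quick sketch for checking an existing prefix (adjust the path to the prefix in question):

[ -d "$HOME/.wine_32bit/drive_c/Program Files (x86)" ] && echo "64-bit prefix" || echo "32-bit prefix"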