+repage makes ImageMagick drop the stored page offset from the files and actually crop “properly”, instead of just recording the offset. Maybe pdfcrop has such a parameter too?
Installing a printer in Cups via a USB->Serial adapter
Make sure to run “modprobe usbserial”. dmesg should show a printer being connected when you plug it in now. Then CUPS should see it too.
RTTY with SDR# and fldigi (for the german DWD stations)
In SDR#: Use USB, filter bandwidth 1000. Center the RTTY signal in your window.
In fldigi: Op Mode -> RTTY -> Custom. Set the carrier shift to custom and then enter 450 in the custom shift field below. Baud rate: 50, 5 bits per character, no parity, 1.5 stop bits. Save and Close. Make sure the Rv button is green!
CW decoding with SDR# and fldigi
In SDR#: Use the CW-L or CW-U preset. Tune so the morse code is right in the middle of your “reception window”.
In fldigi: Op Mode -> CW. Turn off squelch by making the SQL button not green but grey.
It works very well for non-human morse for me. Radio amateur morse is harder and so far full of “spelling errors”. :)
PDF to image with imagemagick/graphicsmagick
If you want to create images from PDF files, use for example: mogrify -verbose -geometry 1600 -density 300 -format png *.pdf. Without a decent “-density” parameter you will probably get a blurry image as a result.
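To see why the density matters, here is a back-of-the-envelope sketch (the A4 page width of 8.27 inches is my assumption for illustration): -density sets the DPI at which the PDF is rasterized before -geometry scales the result.

```python
# Rough pixel math for rendering an A4-wide PDF page with ImageMagick.
# page_width_in is an assumed example value, not from the original post.
density_dpi = 300      # from "-density 300"
page_width_in = 8.27   # A4 width in inches (assumption)

rendered_px = round(density_dpi * page_width_in)
print(rendered_px)  # 2481
```

So at -density 300 the page is rasterized at roughly 2481 px and then downsampled to 1600 px. With the default density (72 DPI, about 595 px for A4) the image would have to be upscaled to reach 1600 px, which is where the blur comes from.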
echo 0 | tee /sys/devices/system/cpu/cpufreq/boost
Might work, might not, depending on unknown factors. Whatever, I just wanted some all-core but unimportant process to run without going to 95 °C. For that it worked perfectly well. CPU temperatures of a Ryzen 5 3600 after many hours of full utilization were at ~65 °C. CPU frequencies were capped to 3.6 GHz with this, while jumping up to 4.2 GHz (and ~94 °C) without.
Obviously this has an impact on performance.
To re-enable, just echo a 1 instead. This is reset anyway when you reboot your system.
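The same toggle as a tiny Python helper, if you prefer that to echo. This is just a sketch: writing the real sysfs path needs root, and the path parameter only exists so you can try it against an ordinary file first.

```python
def set_cpu_boost(enabled: bool,
                  path: str = "/sys/devices/system/cpu/cpufreq/boost") -> None:
    """Write 1/0 to the cpufreq boost switch (root required for the real path)."""
    with open(path, "w") as f:
        f.write("1" if enabled else "0")

# set_cpu_boost(False)  # cap frequencies, like the "echo 0" above
# set_cpu_boost(True)   # re-enable boost
```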
Then (if you really want to do it), uncomment the function call in the last line and execute the script. Follow the instructions.
To clean up: remove or restore the QGIS/QGISCUSTOMIZATION3.ini file in your profile, remove the license directory from your profile, and restore the previous value of UI/Customization/enabled in your profile (just remove the line or disable Settings -> Interface Customization).
If you want to hate yourself in the future, put it in a file called startup.py in QStandardPaths.standardLocations(QStandardPaths.AppDataLocation) aka the directory which contains the profiles directory itself.
BTW: If you end up with QGIS crashing and lines like these in the error output:
...
Warning: QPaintDevice: Cannot destroy paint device that is being painted
QGIS died on signal 11
...
It is probably not a Qt issue that caused the crash. The QPaintDevice warning might just be Qt telling you about your painter being an issue during clean up of the actual crash (which might just be a wrong name or indentation somewhere in your code, cough).
This post’s purpose is to link “Tiardey USB Single Foot Pedal Optical Switch Control One Key Programm Computer Tastatur Maus Game Action HID” to “PCsensor” and the footswitch tool on search engines, so that others who wonder whether the device is easy to use on Linux learn that this is the case. Hope it helps!
I bought this https://www.amazon.de/dp/B09TQFBS3C which came with a Chinese manual saying “FS2007 User Manual” and also “FS2007U1SW (mechanical switch)” (mine clicks, so I guess it is not the “FS2007U1IR (silent photoelectric switch)”). The manual links to pcsensor.com for Windows drivers.
Plug in the device. dmesg should show something like:
[Sun Jan 8 20:25:05 2023] usb 1-4: new full-speed USB device number 7 using xhci_hcd
[Sun Jan 8 20:25:05 2023] usb 1-4: New USB device found, idVendor=1a86, idProduct=e026, bcdDevice= 0.00
[Sun Jan 8 20:25:05 2023] usb 1-4: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[Sun Jan 8 20:25:05 2023] usb 1-4: Product: FootSwitch
[Sun Jan 8 20:25:05 2023] usb 1-4: Manufacturer: PCsensor
[Sun Jan 8 20:25:06 2023] input: PCsensor FootSwitch Keyboard as /devices/pci0000:00/0000:00:14.0/usb1/1-4/1-4:1.0/0003:1A86:E026.0001/input/input19
[Sun Jan 8 20:25:06 2023] input: PCsensor FootSwitch Mouse as /devices/pci0000:00/0000:00:14.0/usb1/1-4/1-4:1.0/0003:1A86:E026.0001/input/input20
[Sun Jan 8 20:25:06 2023] input: PCsensor FootSwitch as /devices/pci0000:00/0000:00:14.0/usb1/1-4/1-4:1.0/0003:1A86:E026.0001/input/input21
[Sun Jan 8 20:25:06 2023] hid-generic 0003:1A86:E026.0001: input,hidraw0: USB HID v1.11 Keyboard [PCsensor FootSwitch] on usb-0000:00:14.0-4/input0
[Sun Jan 8 20:25:06 2023] input: PCsensor FootSwitch as /devices/pci0000:00/0000:00:14.0/usb1/1-4/1-4:1.1/0003:1A86:E026.0002/input/input22
[Sun Jan 8 20:25:06 2023] hid-generic 0003:1A86:E026.0002: input,hidraw1: USB HID v1.10 Device [PCsensor FootSwitch] on usb-0000:00:14.0-4/input1
[Sun Jan 8 20:25:06 2023] usbcore: registered new interface driver usbhid
[Sun Jan 8 20:25:06 2023] usbhid: USB HID core driver
[Sun Jan 8 20:25:10 2023] usb 1-4: reset full-speed USB device number 7 using xhci_hcd
Sweet, so it is just some rebranded PCsensor device.
lsusb says ID 1a86:e026 QinHeng Electronics FootSwitch btw.
footswitch -m ctrl -k 1 will configure it to send Ctrl+1 when pressed for example. See the readme for usage and more examples.
You can use more than 3 of these devices via this pull request. I have four connected via a USB hub (1a40:0101 “Terminus Technology Inc. Hub”, branded “hama”, https://www.amazon.de/dp/B08YRZT1RL) and they work just fine.
This was the final expression (with lots of opportunity to improve):
with_variable(
'point_at_top_of_canvas',
densify_by_count(
make_line(
point_n( @map_extent, 3), -- no idea if these indexes are stable
point_n( @map_extent, 4)
),
42 -- number of trajectories
),
collect_geometries(
array_foreach(
generate_series(1, num_points(@point_at_top_of_canvas)),
with_variable(
'point_n_of_top_line',
point_n(@point_at_top_of_canvas, @element),
point_n(
wave_randomized(
make_line(
@point_n_of_top_line,
-- make it at least touch the bottom of the canvas:
translate(@point_n_of_top_line, 0, -@map_extent_height)
),
-- fairly stupid frequency and wavelength but hey, works in any crs
1, @map_extent_width/5,
1, @map_extent_width/100,
seed:=@element -- stable waves \o/
),
floor(epoch(now())%10000/50) -- TODO make it loop around according to num_points of each line
)
)
)
)
)
Use it on an empty polygon layer with an inverted polygon style and set it to refresh at a high interval (0.01s?). Or use this QGIS project (I included some intermediate steps of the style as layer styles if you want to learn about this kind of stuff):
I will stream my attempt to do the #30DayMapChallenge in a day on https://live.nullisland.org/ in about 9 hours (08:00 UTC, 09:00 CET). Probably going to combine multiple categories as I only have time for ~6 hours. Tune in if you like chaos and fun.
Not sure why I never posted this last year but I did the #30DayMapChallenge in a single day, streamed live via a self-hosted Owncast instance. It was … insane and fun. This year I will do it again, on the 26th of November.
Here are most of the maps I made last year:
Some notes I kept, please bug me about recovering the others from my Twitter archive (I deleted old tweets a bit too early):
18 Water (DGM-W 2010 Unter- und Außenelbe, Wasserstraßen- und Schifffahrtsverwaltung des Bundes, http://kuestendaten.de, 2010)
20 Movement: Emojitions on a curvy trajectory. State changes depending on the curvyness ahead. Background: (C) OpenStreetMap Contributors <3
21 Elevation with qgis2threejs (It’s art, I swear!)
22 Boundaries: Inspired by Command and Conquer Red Alert. Background by Spiney (CC-BY 3.0 / CC-BY-SA 3.0, https://opengameart.org/node/12098)
24 Historical: Buildings in Hamburg that were built before the war (at least according to some not-so-great dataset). Data license: Datenlizenz Deutschland Namensnennung 2.0 (Freie und Hansestadt Hamburg, Landesbetrieb Geoinformation und Vermessung (LGV))
27 Heatmap: Outdoor advertisements (or something like that) in Hamburg. Fuck everything about that! Data license: Datenlizenz Deutschland Namensnennung 2.0 (Freie und Hansestadt Hamburg, Behörde für Verkehr und Mobilitätswende (BVM))
28 Earth not flat. Using my colleague’s Beeline plugin to create lines between the airports I have flown to and the Globe Builder plugin by @gispofinland to make a globe.
I scraped the numbers of live readers per article published by Süddeutsche Zeitung on their website for more than 3 years, never did anything too interesting with it and just decided to stop. Basically they publish a list of stories and their estimated current concurrent number of readers. Meaning you get a timestamp -> story/URL -> number of current readers. Easy enough and interesting for sure.
Here is how it worked, some results and data for you to build upon. Loads of it is stupid and silly, this is just me dumping it publicly so I can purge it.
Database
For data storage I chose the dumbest and easiest approach because I did not care about efficiency. This was a bit troublesome later when the VPS ran out of space but … shrug … I cleaned up and resumed without changes. Usually it’s ok to be lazy. :)
So yeah, data storage: A SQLite database with two tables:
Can you spot the horrible bloating issue? Yeah, when the (same) URLs are stored again and again for each row, that gets big very quickly. Well, I was too lazy to write something smarter and more “relational”. Like this it is only marginally better than a CSV file (I used indexes on all the fields as I was playing around…). Hope you can relate. :o)
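For reference, a sketch of what a more “relational” layout could have looked like. This is not what my scraper did; the table and column names here (urls, url_id) are made up to match the visitors_per_url(timestamp, visitors, url) schema the script uses.

```python
import sqlite3

# Deduplicate URLs into their own table and reference them by integer id.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.executescript("""
CREATE TABLE urls (id INTEGER PRIMARY KEY, url TEXT UNIQUE);
CREATE TABLE visitors_per_url (
    timestamp TEXT,
    visitors INTEGER,
    url_id INTEGER REFERENCES urls(id)
);
""")

def insert_visit(timestamp, visitors, url):
    # INSERT OR IGNORE keeps the urls table free of duplicates,
    # so each URL string is stored exactly once.
    c.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))
    url_id = c.execute("SELECT id FROM urls WHERE url = ?", (url,)).fetchone()[0]
    c.execute("INSERT INTO visitors_per_url VALUES (?,?,?)",
              (timestamp, visitors, url_id))

insert_visit("2020-01-01 12:00", 123, "/politik/some-story")
insert_visit("2020-01-01 12:02", 150, "/politik/some-story")
print(c.execute("SELECT COUNT(*) FROM urls").fetchone()[0])  # 1
```

The insertion queries get a bit more involved, but the repeated URL strings shrink to one row each.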
Scraping
#!/usr/bin/env python3

from datetime import datetime
from lxml import html
import requests
import sqlite3
import os

## TODO
# - store URLs in a separate table and reference them by id, this will significantly reduce size of the db :o)
# - more complicated insertion queries though so ¯\_(ツ)_/¯

# The site updates every 2 minutes, so a job should run every 2 minutes.

## Create database if not exists
sql_initialise = """
CREATE TABLE visitors_per_url (timestamp TEXT, visitors INTEGER, url TEXT);
CREATE TABLE visitors_total (timestamp TEXT, visitors INTEGER);
CREATE INDEX idx_visitors_per_url_timestamp ON visitors_per_url(timestamp);
CREATE INDEX idx_visitors_per_url_url ON visitors_per_url(url);
CREATE INDEX idx_visitors_per_url_timestamp_url ON visitors_per_url(timestamp, url);
CREATE INDEX idx_visitors_total_timestamp ON visitors_total(timestamp);
CREATE INDEX idx_visitors_per_url_timestamp_date ON visitors_per_url(date(timestamp));
"""

if not os.path.isfile("sz.db"):
    conn = sqlite3.connect('sz.db')
    with conn:
        c = conn.cursor()
        c.executescript(sql_initialise)
    conn.close()

## Current time
# we don't know how long fetching the page will take nor do we
# need any kind of super accurate timestamps in the first place
# so let's truncate to full minutes
# WARNING: this *floors*, you might get visitor counts for stories
# that were released almost a minute later! timetravel wooooo!
now = datetime.now()
now = now.replace(second=0, microsecond=0)
print(now)

## Get the webpage with the numbers
page = requests.get('https://www.sueddeutsche.de/news/activevisits')
tree = html.fromstring(page.content)
entries = tree.xpath('//div[@class="entrylist__entry"]')

## Extract visitor counts and insert them to the database
# Nothing smart, fixed paths and indexes. If it fails, we will know the code needs updating to a new structure.
total_count = entries[0].xpath('span[@class="entrylist__count"]')[0].text
print(total_count)

visitors_per_url = []
for entry in entries[1:]:
    count = entry.xpath('span[@class="entrylist__socialcount"]')[0].text
    url = entry.xpath('div[@class="entrylist__content"]/a[@class="entrylist__link"]')[0].attrib['href']
    url = url.replace("https://www.sueddeutsche.de", "")  # save some bytes...
    visitors_per_url.append((now, count, url))

conn = sqlite3.connect('sz.db')
with conn:
    c = conn.cursor()
    c.execute('INSERT INTO visitors_total VALUES (?,?)', (now, total_count))
    c.executemany('INSERT INTO visitors_per_url VALUES (?,?,?)', visitors_per_url)
conn.close()
This ran every 2 minutes with a cronjob.
Plots
I plotted the data with bokeh, I think because it was easiest to get a color category per URL (… looking at my plotting script, ugh, I am not sure that was the reason).
#!/usr/bin/env python3

import os
import sqlite3
from shutil import copyfile
from datetime import datetime, date

from bokeh.plotting import figure, save, output_file
from bokeh.models import ColumnDataSource

# https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.row_factory
def dict_factory(cursor, row):
    d = {}
    for idx, col in enumerate(cursor.description):
        d[col[0]] = row[idx]
    return d

today = date.isoformat(datetime.now())

conn = sqlite3.connect('sz.db')
conn.row_factory = dict_factory
with conn:
    c = conn.cursor()
    c.execute(
        """
        SELECT * FROM visitors_per_url
        WHERE visitors > 100
        AND date(timestamp) = date('now');
        """
    )
    ## i am lazy so i group in sql, then parse from strings in python :o)
    #c.execute('SELECT url, group_concat(timestamp) AS timestamps, group_concat(visitors) AS visitors FROM visitors_per_url GROUP BY url;')
    visitors_per_url = c.fetchall()
conn.close()

# https://bokeh.pydata.org/en/latest/docs/user_guide/data.html so that the data is available for hover
data = {
    "timestamps": [datetime.strptime(e["timestamp"], '%Y-%m-%d %H:%M:%S') for e in visitors_per_url],
    "visitors": [e["visitors"] for e in visitors_per_url],
    "urls": [e["url"] for e in visitors_per_url],
    "colors": [f"#{str(hash(e['url']))[1:7]}" for e in visitors_per_url]  # lol!
}
source = ColumnDataSource(data=data)

# https://bokeh.pydata.org/en/latest/docs/gallery/color_scatter.html
# https://bokeh.pydata.org/en/latest/docs/gallery/elements.html for hover
p = figure(
    tools="hover,pan,wheel_zoom,box_zoom,reset",
    active_scroll="wheel_zoom",
    x_axis_type="datetime",
    sizing_mode='stretch_both',
    title=f"Leser pro Artikel auf sueddeutsche.de: {today}"
)

# radius must be huge because of unixtime values maybe?!
p.scatter(
    x="timestamps", y="visitors", source=source,
    size=5, fill_color="colors", fill_alpha=1, line_color=None,
    #legend="urls",
)

# click_policy does not work, hides everything
#p.legend.location = "top_left"
#p.legend.click_policy="hide"  # mute is broken too, nothing happens

p.hover.tooltips = [
    ("timestamp", "@timestamps"),
    ("visitors", "@visitors"),
    ("url", "@urls"),
]

output_file(f"public/plot_{today}.html", title=f"SZ-Leser {today}", mode='inline')
save(p)

os.remove("public/plot.html")  # will fail once :o)
copyfile(f"public/plot_{today}.html", "public/plot.html")
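One gotcha in that hash-based colors line: Python's hash() of a string is randomized per process (PYTHONHASHSEED), so the URL colors change on every run of the plotting script. A stable alternative, just as a sketch (not what my script did), derives the color from a cryptographic digest instead:

```python
import hashlib

def stable_color(url: str) -> str:
    # First 6 hex digits of the MD5 digest: a deterministic,
    # valid "#rrggbb" color per URL, identical across runs.
    return "#" + hashlib.md5(url.encode("utf-8")).hexdigest()[:6]

print(stable_color("/politik/some-story"))  # same color every run
```

That way a story keeps its color across daily plots, which makes them much easier to compare.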
Results and findings
Nothing particularly noteworthy comes to mind. You can see perfectly normal days, you can see a pandemic wreaking havoc, you can see fascists being fascists. I found it interesting how clearly you can see when articles were pushed on social media or put on the frontpage (or taken off it).