Found at: zaibatsu.circumlunar.space:70/~solderpunk/phlog/progress-toward-offline-first.txt

Progress towards "offline first"
--------------------------------
I wrote previously about how I was going to try "computing less,
but with more focus"[1], with part of this being moving toward an
"offline first" mode of computing, where I connect to the internet
only when I need to, and only for as long as it takes to accomplish
some specific task.  My primary motivation for this was to cut down on
distraction and procrastination, both of which a permanent internet
connection facilitates wonderfully, and to turn sitting down behind
a computer into something I do with a clear purpose, instead of an
entry into some kind of default zombie-like state - precisely the way
a lot of people treat sitting in front of a television, in fact.
My recent musings on sustainable computing have provided some
secondary motivation, but it really is mainly about being on my
computer less, but getting more done behind it at the same time.
In addition to being offline first, I wanted to try setting weekly
"focal points" for my computing.  I might try to do regular weekly
recap posts where I summarise what I did, but then again I might not.
But I'm going to write such a recap (you're reading it!) for my
first week of this system, because this past week's focal point
has been gearing up for precisely this shift, and I thought that
might be of interest to some people.

I have decided to experiment with an approach where I have a
single script - named `~/bin/do-the-internet.sh` - which, well,
"does the internet".  It's a kind of batch script which handles all
the non-interactive aspects of my internet usage.  Obviously it
cannot, for example, look something up on Wikipedia for me which
I was wondering about earlier in the day, but it should be able
to handle all of the mechanical stuff I need to do, freeing me up
to focus on precisely things like that.  So far I have managed to
get it handling the publication of my Gopher and Gemini content,
and the receiving and sending of mail.

Handling publication was relatively straightforward using git.
Up until now I have maintained a very manual and hands-on approach
to all my publication.  I usually write posts like this one locally,
then scp them up to the appropriate server, and edit the gophermap
or index.gmi by hand to link to it.  Now I've changed things so
that each server has a bare git repository on it containing all the
files for my online presences.  I have cloned these to my local
machine, and make all changes there.  When I push changes back
to the origin repository, a post-receive hook checks out a copy
of the files to the location where the appropriate server serves
content from.  I have Gophernicus and Molly Brown running locally
pointed at my cloned repositories, so I can test that things look
and work the way I intend before I push them to the wider world,
and so I can easily refer back to stuff I have written previously.
This all works pretty smoothly.  This part of my do-the-internet.sh
script is very simple:

```
echo "Pushing gopherhole..."
cd ~/gopher && git push
echo "Pushing gemini capsule..."
cd ~/gemini && git push
```
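For anyone wanting to replicate the server-side half, the
post-receive hook can be as small as the following sketch (the
document root path is an assumption, not my actual setup; git runs
hooks with GIT_DIR pointing at the bare repository, so setting
GIT_WORK_TREE checks the pushed files out elsewhere):

```shell
#!/bin/sh
# Sketch of hooks/post-receive inside the bare repository on the
# server.  When git invokes this after a push, GIT_DIR is already set
# to the bare repo, so pointing GIT_WORK_TREE at the directory the
# gopher/gemini daemon serves from checks the new files out there.
DOCROOT=${DOCROOT:-/var/gopher}    # illustrative document root
if [ -n "$GIT_DIR" ]; then
    GIT_WORK_TREE="$DOCROOT" git checkout -f master
else
    echo "not invoked as a git hook; nothing to do"
fi
```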

I can probably make this faster by only actually running `git push`
if the local repository is known to be ahead of the remote origin.
I expect this must be very easy, but my knowledge of both git and
shell scripting is actually quite weak, so I have no idea how to do it.
Pointers are welcome.  If I can indeed get these lines to run quickly
(without setting up an ssh connection when there's nothing to push),
I will add my most regularly updated software projects to the list.
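One way to do this, for what it's worth, is to count the commits the
upstream branch doesn't have yet with `git rev-list --count` before
pushing.  A sketch (the message wording and the helper function name
are my own, not part of any existing tool):

```shell
#!/bin/sh
# Push a repository only when the local branch is ahead of its
# configured upstream, so no ssh connection is opened when there is
# nothing to send.
push_if_ahead() {
    dir=$1
    if [ ! -d "$dir/.git" ]; then
        echo "no repo at $dir"
        return 0
    fi
    # Number of local commits the upstream (@{u}) does not have yet.
    ahead=$(git -C "$dir" rev-list --count '@{u}..HEAD' 2>/dev/null || echo 0)
    if [ "$ahead" -gt 0 ]; then
        echo "Pushing $dir ($ahead commit(s) ahead)..."
        git -C "$dir" push
    else
        echo "Nothing to push in $dir."
    fi
}

push_if_ahead ~/gopher
push_if_ahead ~/gemini
```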

Handling email was not as straightforward, involving multiple
separate tools for different parts of the process, but I think I
have something quite nice in place now, based around keeping mail
in local Maildir folders.  When I recently switched from using SDF
as a mail provider to Posteo, I also switched from using venerable
old `mutt` (while ssh'ed into SDF) to using `aerc`[2], so everything
I am about to describe here is done in an aerc environment, although
you could just as easily do the exact same thing in mutt.

I am using a program called `getmail`[3] to fetch mail from my
posteo.net account via IMAP and save the messages to a local folder.
It's more traditional to use POP for this kind of thing, but IMAP,
along with some smart local state maintenance, allows a little
more flexibility: instead of emails being removed from the server
as soon as they are downloaded, they can be left on there for a
scheduled number of days and then deleted.  This means that recent
enough emails can still be read on the server via IMAP or a webmail
interface when I am not at my main machine with the offline Maildir,
but if Posteo's servers are ever compromised, only a small subsample
of my mail becomes available to the perpetrator.
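For the curious, the getmail side of this can be expressed in a
single rcfile along roughly these lines (server name, credentials,
paths and the retention period here are all illustrative - consult
the getmail documentation for the exact option names and values):

```
[retriever]
type = SimpleIMAPSSLRetriever
server = posteo.de
username = someone@posteo.net
password = secret

[destination]
type = Maildir
path = ~/Maildir/

[options]
# Only fetch messages not previously retrieved...
read_all = false
# ...and delete them from the server 30 days after retrieval.
delete_after = 30
```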

In order to handle sending mails, I am using the `msmtp` SMTP
client tool[4], along with a community-provided "queuing script"[5].
I have configured aerc to use the `msmtpq` (as in "msmtp queue")
script as a sendmail program.  All that script does is save
the email it receives, and its command line arguments, to local
files in a queue directory.  The process of "sending an email",
from within aerc, feels instantaneous, as all it involves is
writing to the disk.  Within my do-the-internet.sh script, I use
the corresponding `msmtpq-flush` script to actually send those
queued emails to Posteo's SMTP server for delivery.  Basically,
the overall workflow goes like this: I connect to the internet,
and run my do-the-internet.sh script.  This downloads new mail to a
local Maildir.  Once that's done, I disconnect from the internet.
I read the messages offline at my leisure.  When I have the time
and energy, I write replies, which are stored on the disk.  The next
time I connect to the internet and run my do-the-internet.sh script,
those replies actually get sent to where they are going, and the
next batch of new email comes in and the cycle begins again.
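The queue-then-flush idea itself is tiny.  This is not the real
msmtpq script, just a minimal sketch of the mechanism (function
names, file layout and the placeholder echo are all mine; real
delivery would hand the saved files to msmtp):

```shell
#!/bin/sh
# "Sending" only writes the message and its sendmail-style arguments
# to a queue directory; a later flush delivers everything at once.
QUEUE=${QUEUE:-$HOME/.msmtp.queue}    # illustrative queue location
mkdir -p "$QUEUE"

queue_mail() {
    # Invoked like sendmail: message on stdin, recipients as arguments.
    stamp=$(date +%Y%m%d-%H%M%S)-$$
    cat > "$QUEUE/$stamp.mail"
    printf '%s\n' "$@" > "$QUEUE/$stamp.args"
}

flush_queue() {
    for mail in "$QUEUE"/*.mail; do
        [ -e "$mail" ] || continue    # queue is empty
        args=${mail%.mail}.args
        # Real delivery would be: msmtp $(cat "$args") < "$mail"
        echo "would send $mail to: $(cat "$args")"
    done
}
```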

Again, this part of the script looks very simple, with all of the
details hidden in config files:

```
echo "Fetching Posteo mail..."
getmail --rcfile posteo.net
echo "Sending queued Posteo mail..."
~/bin/msmtpq-flush
```

(Apologies to everybody who grew up on dialup, for whom this approach
is familiar and obvious and doesn't need careful explanation,
but a lot of younger folks will have never used email this way in
their life)

My circumlunar email doesn't fit this model as well.  Because that
service is intended primarily as an internal mail system for
circumlunar colonies, plus a few other pubnixes, there are no IMAP
or POP services running.  I am using rsync over ssh to simply pull
down a copy of the Maildir which lives in my Zaibatsu home directory
- this way I can at least read old emails while offline, and also
know when new mail has arrived so I can ssh in and respond to it.
I guess I could set up some kind of system where replies are stored
locally, synced back up to the Zaibatsu and then sent via a cron job,
but I'm not sure it's worth it.
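The rsync step is a one-liner.  Something like this sketch (the
local path is an assumption on my part):

```shell
#!/bin/sh
# Mirror the remote Maildir over ssh.  rsync copies only what has
# changed; --delete keeps the local copy an exact mirror, so mail
# removed on the server also disappears locally.
sync_zaibatsu_mail() {
    mkdir -p "$HOME/mail/zaibatsu"
    rsync -az --delete \
        "zaibatsu.circumlunar.space:Maildir/" \
        "$HOME/mail/zaibatsu/"
}
```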

The biggest gap in my setup right now is consuming other people's
Gopher and Gemini content.  I anticipate a new software project in
the very near future, which checks a small, pre-defined number
of endpoints via both protocols, identifies URLs it has not
seen previously, fetches each of them, saves the content to disk
and generates a local index which makes it clear what has been
recently fetched.  In an ideal world this would just be a fairly
standard Atom/RSS feed reading tool, but the reality is that on
both Gopher and Gemini the majority of content is not syndicated
that way, and this is unlikely to change anytime soon.  So, some
creativity will be required, but I've been thinking about this for
a while and I think it can be made to work in a quite simple way.
If I can get such a tool working and add it to my do-the-internet.sh
script, then I can get my Gopher/Gemini workflow working much like
my email workflow: I grab fresh batches of new content every now
and then while online, read them at my leisure, write responses at
my leisure and then push them to my server next time I'm online.
I really look forward to having this work nicely.
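As a rough illustration of what the fetching layer of such a tool
could look like - nothing here is an existing tool; the state
directory, function names and hosts are placeholders - both
protocols are simple enough to speak from a shell script:

```shell
#!/bin/sh
# A Gemini request is the full URL plus CRLF over TLS on port 1965;
# a Gopher request is the selector plus CRLF in the clear on port 70.
STATE=${STATE:-$HOME/.offline-feeds}    # illustrative state directory
mkdir -p "$STATE"

fetch_gemini() {
    url=$1 host=$2
    printf '%s\r\n' "$url" |
        openssl s_client -quiet -no_ign_eof -connect "$host:1965" 2>/dev/null
}

fetch_gopher() {
    selector=$1 host=$2
    printf '%s\r\n' "$selector" | nc -w 5 "$host" 70
}

mark_unseen() {
    # Filter stdin down to URLs not yet in the seen-list, recording
    # each new one so it is skipped on the next run.
    seen="$STATE/seen-urls"
    touch "$seen"
    while read -r url; do
        if ! grep -qxF "$url" "$seen"; then
            echo "$url"
            echo "$url" >> "$seen"
        fi
    done
}
```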

None of this addresses the "looking stuff up on demand" part
of my internet usage.  I don't think it will ever be possible to
eliminate this entirely, but I do suspect it can probably be reduced
substantially.  There are multiple projects aimed at making offline
reading of Wikipedia possible, and that would certainly go a long
way.  I think that this undertaking might also end up turning me into
something of a fanatic about packaging documentation with software.
Good quality man pages make a world of difference to being able to
solve problems without going online.  The Go programming language has
also proven itself wonderful in this regard - there is a tool called
`godoc` which you can use to run a local webserver which generates
static web documentation for the standard library version which is
installed on your machine, plus any third party libraries you have
installed with `go get`.  It is exactly the same software used to
serve the official online documentation at golang.org, and all you
need to do in order to be able to access it while offline is run
`godoc -http localhost:8080`, simple as that.  This is incredible,
and all programming languages should aspire to similar heights.

The only thing I'm left kind of wondering about is the Fediverse.
No doubt it would be possible in principle to write some kind
of tool to implement the periodic push/pull workflow for toots,
but somehow this doesn't feel like a good match for the medium.
Things like email, and phlog/gemlog posts, are very well suited to
long-lived discussions which unfold over the course of days or weeks.
Fediverse threads tend to progress more quickly and have much
shorter lives.  I don't really want to just up and leave, though.
For all the medium's very real faults, it is the easiest way I have
for casual interaction with a growing group of like-minded people
who have made tremendous positive contributions to my thinking
and beliefs in recent years, and I honestly think I'd be worse off
for cutting the cord.  Perhaps I will just assign myself roughly
scheduled times to dedicate to it - say, a half hour session one
weekday evening and one on each day of the weekend.  It doesn't count as
distraction or time wasting if you sit down specifically to do it!
I'd log out and close all related tabs outside of those times.
Or perhaps I'll only use it on my phone, I dunno.

Obviously, working on this system over the past week involved being
online quite a bit in order to test it out, so I wasn't really
as offline as I intended.  I'm looking forward to testing the new
system out in earnest this coming week.  As for a new focal point,
as keen as I am to get to work on some kind of tool for fetching
Gopher/Gemini content for offline reading, I think first I will
spend some time working on AV-98, which is still lagging quite far
behind some recent(ish) simplifying updates to the Gemini protocol
specification.  It also doesn't yet support the "Gemini stream" idea.
While I am working on it, I might try to add support for some kind of
lightweight solution for offline reading - something like a "stash"
command which works kind of like "tour" but rather than just queuing
up URLs actually grabs content and queues that up in memory...

[1] gopher://zaibatsu.circumlunar.space:70/0/~solderpunk/phlog/computing-less-but-with-more-focus.txt
[2] https://aerc-mail.org/
[3] http://pyropus.ca/software/getmail/
[4] https://marlam.de/msmtp/
[5] https://github.com/marlam/msmtp-mirror/tree/master/scripts/msmtpqueue

