Stripe’s Okay Email Unsubscribe UX

Stripe has a nice user experience for unsubscribing from their marketing and onboarding emails. It is called “Manage your Stripe email preferences.”

The Good

Unsubscribe from all
There are subject-specific choices, with a single “unsubscribe from all” link at the top. Clicking “unsubscribe from all” does not require a second click; it just submits the form.

Other emails you may be interested in
Stripe offers up lists that normal users don’t get subscribed to, alongside anything that came auto-opted-in with the signup.

Reminder about “business critical emails”
Unsubscribing from a company’s emails doesn’t mean you won’t hear from them. It’s good to remind users that this is the case. That said, some companies abuse this by sneaking marketing into account emails.

Clear and simple design
Many companies make their email preferences pages look terrible. That is a mistake: this page is a touch point where your customer is deciding how much of a relationship they want with your company.

Ways to Improve

Acknowledge user actions
Offer better visual feedback resulting from user interaction with the page.

Offer a direct link in emails that unsubscribes from that list type
One click should land on this page, confirm the unsubscribe was successful, and say “here are the rest of the settings you can also change.”

Ask on any auto-opt-in
Canada requires this with CASL, but US companies should also get explicit permission before mailing based on a sign-up event. In this case, I have used Stripe many times and don’t want any email on a new account.

Email frequency or triggers
Set expectations for users on how often these arrive or what causes them to be sent.

Screenshot of Stripe’s Email Unsubscribe UX

The Fundamental Dishonesty of Facebook

Apple will soon be showing a pop-up in the Facebook app stating that Facebook requires permission to track you across apps and websites, and Facebook is not happy about it.

In a blog post today, “Speaking Up for Small Businesses,” Dan Levy, VP of Ads, posted a video trotting out small business owners to defend Facebook’s use of advertising, along with a list of bullet points about why Apple should not remind users that Facebook tracks you everywhere you go on the internet.

A point Facebook makes in their blog post is this:

“It will force businesses to turn to subscriptions and other in-app payments for revenue, meaning Apple will profit and many free services will have to start charging or exit the market.”

“Speaking Up for Small Businesses,” Facebook Newsroom, 12/16/2020

This is fundamental dishonesty by Mark Zuckerberg and the executive team at Facebook.

The product “Facebook” and its relative, Instagram, are not free. They come with the cost of an intrusive violation of privacy that is opaque to or misunderstood by their users.

Facebook’s harm isn’t just selling access to you based on demographics and personal interests; it includes the research Facebook does to trigger you into more engagement, which causes more ad viewing.

Facebook’s need to trigger engagement was a major factor in the rise of QAnon.

“Free” services are a falsehood. Businesses should charge so they do not fall into the trap of generating increasingly triggering content and user experiences that are neither healthy nor desired by “users.”

If your product is good enough, people will pay for a pro version. We know this from products like Spotify.

It is Facebook’s fault that it will not provide a pro version of its own product: the most valuable users would no longer be available to advertisers, which would greatly undermine the profit it makes from tracking-based advertising.

More simply, the most valuable people don’t want to be tracked, and selling them to advertisers earns Facebook more than it could ever charge for a subscription.

Facebook is mad that Apple is pointing this out to users who continue to use Facebook, Instagram, or its other associated products.

Despite its technical excellence and open source contributions, Facebook has a putrid business model that stains the entire company.

Both Facebook’s leadership and its shareholders deserve to lose for their continued support and enablement of this horrendous blight on the internet.

Github Actions Workflow Visualization UI Update

Last night Github appears to have released a big update to the Github Actions workflow user interface.

While there has not been a post on the Github changelog about this yet, the Github roadmap does list #88 Actions: Workflow visualization as a feature that was slotted for Github Enterprise customers but became a release for all users as of mid-October.

Here’s a comparison of a workflow detail view as of yesterday versus today on the same Github Actions workflow. (Note, I also run refined-github so you may notice unrelated style improvements in some elements):

Previous Github Workflow UI
Update to Github Workflow UI

My primary concern with this view was not only the visual presentation of the workflow, which I had already tried to mend using stylebot, but that the data output from individual jobs in workflows was disappearing and required an unexpected refresh.

There is also a brand new Workflow Summary tab:

New Github Actions Workflow Summary Tab

In this project, I’m working entirely with Self-Hosted Runners that have a fairly detailed CI/CD requirement.

My other projects use Github runners, and I think Github Actions is a major improvement in delivering CI/CD to projects and takes advantage of the resources provided by the Microsoft acquisition via Azure.

If I have time, I’ll offer a more detailed analysis of the change. For now, you can see my previous writing on Running Django Tests in Github Actions.

Justice and the Conscience

I do not pretend to understand the moral universe, the arc is a long one, my eye reaches but little ways.

I cannot calculate the curve and complete the figure by the experience of sight; I can divine it by conscience.

But from what I see I am sure it bends towards justice.

Things refuse to be mismanaged long.

Theodore Parker, 1853, calling for the abolition of American slavery

“Let us realize the arc of the moral universe is long, but it bends toward justice.”

Martin Luther King Jr. 1964 on struggle against economic and racial injustice

Our planet is a lonely speck in the great enveloping cosmic dark. In our obscurity, in all this vastness, there is no hint that help will come from elsewhere to save us from ourselves.

Carl Sagan, Pale Blue Dot on the preservation of Earth

Running Django Tests in Github Actions

I’ve bought into Github Actions as a CI/CD and deployment workflow for Django.

But since Github Actions is still young, there are not as many guides on using it with Django, especially for the testing step.

If you are just getting started using Github Actions with Django, I suggest reviewing Michael Herman’s recent epic, “Continuously Deploying Django to DigitalOcean with Docker and Github Actions.” The repo for Michael’s tutorial is here.

What about Testing?

Test Driven Development (TDD), or really just writing tests as a matter of habit while programming, allows you to build bigger things faster.

If you’ve been building for a while, you know this, and maybe you’d enjoy Brian Okken’s Test & Code podcast.

I realize many folks are migrating over to Pytest, but I have not yet. So, this example focuses on the normal Django TestCase. It also presumes you use Postgres.

Why not use SQLite instead of Postgres for CI/CD Testing?

When I first looked at setting this up, I was tempted to use SQLite instead. After all, it wouldn’t require spinning up a container!

However, if you have read Speed Up Your Django Tests by Adam Johnson, you’d know that it is not a good idea to swap in the use of SQLite for Postgres when running your tests.

Thus, we shall not be tempted to use SQLite even though it might be easier! Instead we will seek shortcuts to make the use of Postgres in your Github Actions CI for Django as fast as possible.
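To make the difference concrete, here is a small illustration using only Python’s built-in sqlite3 module (a sketch of my own, not from Adam’s book): SQLite does not enforce VARCHAR length limits, so a test relying on the database rejecting oversized values would pass under SQLite and fail against Postgres.

```python
import sqlite3

# SQLite silently ignores the VARCHAR(10) length constraint,
# while Postgres would reject the oversized value with an error.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plants (name VARCHAR(10))")
conn.execute("INSERT INTO plants (name) VALUES (?)", ("m" * 1000,))
stored_length = conn.execute("SELECT length(name) FROM plants").fetchone()[0]
print(stored_length)  # 1000 -- the constraint was never enforced
```

Behavior differences like this are exactly why test results against SQLite can lie to you about production.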

Example Django Test Job for Github Actions using Postgres

Here is what I came up with to perform a Django testing stage using a project setup similar to Michael’s:

  # Test job, alongside the build and deploy jobs
  run_tests:
    if: github.ref == 'refs/heads/develop'
    name: Run Django Tests
    runs-on: ubuntu-latest
    services:
      db:
        image: postgres:12.3-alpine
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: github_actions
        ports:
          - 5432:5432
        options: --mount type=tmpfs,destination=/var/lib/postgresql/data --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5
    steps:
      - name: Checkout
        uses: actions/checkout@v1
        with:
          ref: 'refs/heads/develop'
      - name: Set up Python 3.8.5
        uses: actions/setup-python@v1
        with:
          python-version: 3.8.5
      - name: Install dependencies
        run: pip install -r app/requirements.txt
      - name: Run tests
        run: python app/manage.py test app/
        env:
          SYSTEM_ENV: GITHUB_WORKFLOW


Line 2: This job sits at the same indentation level as the build and deploy jobs used in Michael’s example Github Actions workflow file.

Line 3: Higher up in your workflow, your action should be on: [push]. Line 3 tells Github Actions to only run the testing if the push was to a branch called develop. You can change this to suit your project, though note it will also need to be updated below.
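For reference, the trigger higher up in the workflow file would look something like this (a minimal sketch; the workflow name here is just a placeholder):

```yaml
name: Django CI/CD
on: [push]

jobs:
  # build, test and deploy jobs follow here
```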

Line 8: Note that the postgres service is from Alpine. Most guides use the general postgres container, but that’s not necessary. Alpine is the smallest Docker image version of postgres I’m aware of, and you should choose it to help make this job as fast as possible.

Lines 9-12: These environment variables are used by the postgres image in setting up the database that will be used for testing. They must match what your project’s settings.py uses when it is run by Github Actions.

Line 13: It is simplest to explicitly declare 5432 as the port for this job.

Line 15: These options ensure the database is ready before allowing the job to continue to run the tests. In addition, a temporary filesystem (tmpfs) is mounted. h/t @AdamChainz

Line 20: This specifies where the code that contains the tests you’re running will be. This line should match Line 3 above: you want to test the branch you’ve just pushed to.

Lines 21 and 25: Presuming your requirements.txt file contains the expected postgres dependencies, these are the only two steps you need to perform in order to get to running your Django tests. Some guides suggest you need to run migrations or install postgres dependencies manually. I did not find this to be the case.

Line 28: Note that you must specify the project path twice in the test command. Normally, in PyCharm for example, you don’t notice this is needed, but try running ./manage.py test from the parent directory of /app without the second app/ and you’ll see zero tests are run.

Line 30: This line is crucial! It means that the SYSTEM_ENV environment variable is set to GITHUB_WORKFLOW at the time your Django project is run by Github Actions.

This is important because you will need to customize the DATABASES portion of your settings.py so that it will be set appropriately for this stage of Github Actions.

Configuring your Django Settings for Github Actions Django Testing Job

SYSTEM_ENV = os.environ.get('SYSTEM_ENV', None)

if SYSTEM_ENV == 'PRODUCTION':
    DEBUG = False
    SECRET_KEY = os.environ.get('SECRET_KEY', None)
    # Set Sentry and Error Debug
    import sentry_sdk
elif SYSTEM_ENV == 'STAGING':
    DEBUG = True

elif SYSTEM_ENV == 'GITHUB_WORKFLOW':
    DEBUG = True
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'github_actions',
            'USER': 'postgres',
            'PASSWORD': 'postgres',
            'HOST': 'localhost',
            'PORT': '5432',
        }
    }

else:
    DEBUG = True

The above shows the settings override section of a Django settings.py that allows you to detect the SYSTEM_ENV environment variable.

For the example in this blog entry, you only need to pay attention to Lines 11-23. However, I included the declaration of SYSTEM_ENV (Line 1) and the conditional detection of production and staging environments to help show how SYSTEM_ENV can be used to switch Django settings overrides for a dynamic CI environment like Github Actions.

Pay attention to lines 17-21. These need to match the settings used for the db service in the job’s postgres service declaration above. If you don’t get the match right, or something is funky with your settings overrides, you may get an error like:

django.db.utils.OperationalError: could not translate host name "db" to address: Temporary failure in name resolution

I give an extended explanation of how to resolve this in this Stack Overflow answer.

Notes on Debugging Github Actions

As I mention in the above SO answer, one of the problems with Github Actions is the delay between trying something and waiting for feedback from Github.

The disconnect between a change and its result can disguise things you might otherwise catch in local development. For example, you may think you’re working on the relevant code / pushing to the right place / setting the correct environment variables when you are not.

Part of the reason this might happen is because you started browsing the web or were otherwise distracted waiting to see if your fix worked. I am guilty of this.

Another pain is that in writing Github Actions that act on a push, you may find yourself pushing a bunch of clumsy changes to a branch that you don’t want them in.

If you hold yourself to high commit message quality, it can become a bit annoying to have to keep explaining changes, and reverts of those changes, as you test things out.

Improvements to this example

This is just one job out of what can be an awesome CI/CD workflow for Django with Github Actions. For example, you would probably not want to deploy a copy of your Django project that fails the testing job!

With a complete setup you could add a

needs: [build_develop, run_tests]

to your deploy_staging job, which would allow your build and test jobs to run in parallel! I won’t get into that here, but it really is magic when it is all working.
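As a sketch of that idea (assuming the build_develop and deploy_staging job ids from a workflow like Michael’s, alongside the run_tests job above; the deploy step itself is hypothetical):

```yaml
  deploy_staging:
    # Waits for both jobs; build_develop and run_tests still run in parallel
    needs: [build_develop, run_tests]
    runs-on: ubuntu-latest
    steps:
      - name: Deploy
        run: ./scripts/deploy_staging.sh  # hypothetical deploy step
```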

Specific to this guide, I have no doubt this can be improved upon, as I’m still learning the ways of Github Actions myself. Please feel free to suggest improvements in the comments, hit me up on twitter, or email me.

Additional Links on This Topic

Here are some additional links and resources I found while learning about this topic:

Monstera Deliciosa Timelapse with Wyzecam

One of my favorite houseplants is a Monstera I picked up from Portland Nursery last year. I normally give the plant a bunch of artificial light to help it out. Every now and then a big new leaf grows, and this time I captured the last five days of its unfolding in a time lapse video.

The music for this video comes from Kora’s sunrise set at Maxa Xaman, Burning Man 2018.

To make this HD time lapse, I recorded the video on a Wyze cam, which I physically removed the internal hardware microphone from. This cheap cam has a handy auto night mode which shows how much leaf development is happening at night.

I chose not to use wyzecam’s own timelapse behavior because I wanted to build it from the raw HD video myself and get the best time compression possible.

The data was stored on a 32GB microSDHC UHS-I memory card. I read the card using this handy pink card reader.

Wyzecam stores many mp4 files per hour in individual folders per day. I used MP4tools to perform the joins, which took the most time. Some of the mp4 files created by the wyzecam were corrupted. I didn’t bother to try to repair them, but this resulted in some jumpiness at times, as they were not compiled into the day-long clips.

Finally, I dropped each compiled day of leaf growth (~8GB each) into iMovie and added some tunes. I used the simple timeline editor to shorten each day. Not a precise operation, but it handled the job quickly. The timeline looks like this:

If I were to improve on this timelapse creation process, I would:

  • Consider more consistent lighting for daytime
  • Get a 64GB micro sd card to get 10 days of video
  • Write some kind of automation for the video segment assembly
  • Spend more time on the music / video production
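On the automation point, the segment assembly could be scripted. Here is a minimal sketch that builds an ffmpeg concat invocation for a day’s worth of Wyzecam mp4 segments; the folder layout and file names are assumptions, and the function only constructs the command rather than running it:

```python
from pathlib import Path

def build_concat_command(day_folder, output_file):
    """Build an ffmpeg command that joins every mp4 in day_folder,
    in name order, into a single clip without re-encoding."""
    segments = sorted(Path(day_folder).glob("*.mp4"))
    # ffmpeg's concat demuxer reads a list file of input paths
    list_lines = [f"file '{segment}'" for segment in segments]
    command = [
        "ffmpeg", "-f", "concat", "-safe", "0",
        "-i", "segments.txt",  # write list_lines to this file first
        "-c", "copy", output_file,
    ]
    return list_lines, command

# Usage (hypothetical folder name):
# lines, cmd = build_concat_command("2020-11-01", "day1.mp4")
# Path("segments.txt").write_text("\n".join(lines))
# then run cmd with subprocess.run
```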

Ongoing Project: EasyALPR – Parking Enforcement App

I’ve been working on a series of parking enforcement products using license plate recognition technology, called EasyALPR.

Several years ago I was working on a privacy and crypto-currency application, Gliph. Apparently, this was the first venture-funded crypto-startup because we raised our first round prior to Coinbase.

That company did not make it, so with funding from some of my previous investors, I started another one.

This time I flipped privacy on its head by building a live video streaming application. Any old smartphone becomes a webcam.

That product concept, Perch, was real art: see-all-the-things type stuff. The world in a fishbowl. Police action, puppies, perpetrators, you name it all live and archived in high definition. I go into more details in this EasyALPR company blog entry.

The tech was sweet, beat Facebook Live. Beat Twitch Clips by months. But the concept was too artsy, and too expensive to operate. I shuttered that product but kept the company.

When a user caught some perpetrators at a gas station in Oakland on Perch, I stumbled again on the power of LPR. Ya, it only took me five years to go from worrying about it to working on it.

I did consider building some more scan-all-the-things type stuff as I explored five years ago on this blog. But I am not into that whole vibe really.

I am trying to focus with this product and so far it has been terrific for patrolling parking lots. So I’ve become interested in parking even though I don’t own a car.

It is really about automation and doing away with manual processes. EasyALPR has a license plate recognition app you can download free from the App Store, and a web application that does a lot of cool stuff too.

I explore that some in this entry on ALPR databases and more.

Not the most artistic idea in the world, but I built most of the product entirely myself, and I’m pretty happy with it so far. I have two product releases out so far, Parking Hero and Parking Defender. I have another, new one, in beta right now.

Hello Again

I haven’t posted an entry for over four years.

I was active for most of that time micro-blogging. I have a lot of content on Twitter but set it all to private and stopped posting not long after the 2016 election.

I’m still into music and technology.

Not so much cryptocurrency. I’ve been getting into house plants recently. I’ve enjoyed Burning Man for several years and have been growing new capabilities to apply out there.

Maybe I’ll post more often again. Maybe not. There’s going to be at least one to drain some SEO toward a project.

Want to Spread an Idea Fast? Describe it with Software.

I’ve been thinking about ideas recently. What it takes to move from neurons in one person’s head to changing the lives of many.

The essence of an idea can be documented in a software’s backend system. Hidden behind the buttons you click on and the input boxes you type into is a backend that describes complex “business rules” or logic that express the idea.
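As a toy illustration of what a “business rule” hidden in a backend looks like (hypothetical code, not any real system’s): a Wikipedia-like rule such as “anyone may edit, and any peer may revert an edit” can be stated in a few lines of backend logic, and every UI built on top of it inherits that idea.

```python
# Hypothetical backend rule: anyone can edit; peers can revert any edit.
def apply_edit(history, editor, new_text):
    """Record an edit from any editor; the full history stays reviewable."""
    history.append({"editor": editor, "text": new_text})
    return history

def revert(history):
    """Any peer may undo the latest edit by restoring the prior version."""
    if len(history) > 1:
        history.pop()
    return history

history = apply_edit([], "anonymous", "First draft")
history = apply_edit(history, "peer_reviewer", "Improved draft")
history = revert(history)
print(history[-1]["text"])  # First draft
```

The rule itself, not the buttons, is what teaches users that collaboration works this way.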

The focus of my recent work is the expression of an idea in software at its various user interface (UI) endpoints. A goal of this work is to build interfaces that communicate the fundamental idea to people while solving specific existing problems.

Even unknowingly, users are influenced by core concepts that drive a software system’s behavior. When people engage with a system via a software UI, it can establish new social norms and behavior.

For example, if you have used Wikipedia, you immediately learn that it is possible for anyone to share important information using writing and pictures, and that this can be reviewed and edited by peers for free.

The idea that people could collaborate in such a way was not widely understood or accepted until a backend system and desktop web interface were created to express it.

In Facebook’s S-1 filing, Mark Zuckerberg said Facebook was created “to accomplish a social mission–to make the world open and more connected.” To express such an idea in software at the time meant reliving past ideas like MySpace’s wall and bringing new taste to the expression of the idea. Interestingly, part of Facebook’s success was in limiting the idea’s early availability to students.

Compared to reading a white paper or listening to a lecture, average folks will probably understand the meaning of an idea more quickly by interacting with it via software. That is, if the software is fun to use.

Software can now spread to individuals extremely fast.  This is exciting because when a sufficiently advanced new idea is described for the first time in software, the idea may be spread nearly as fast.

This suggests that if you have a big new idea and want the idea to influence how people think and behave, perhaps you should consider how it would be described using software.

Mobile Automated Fluid Dispenser that Accepts Bitcoin as Payment Mechanism

Farmer, thermal dynamics researcher, and DIY pro, Andy Schroder created a “mobile, automated fluid dispenser that accepts Bitcoin as a payment mechanism.”

This is an amazing look at the future of P2P service delivery. I’m particularly impressed with the depth of thought Andy gave to the capabilities of his demonstration. If you want to skip the explanation, the action really starts about 8 minutes into the video:

This is so futuristic and awesome that it could be improved only by the presence of a node on a mesh network with an uplink to the Bitcoin network, rather than reliance on commercial telecom. Bravo, Andy.