
Bing wallpaper on a Mac 3.0

I finally got around to fixing the wallpaper script that was the subject of my last two posts: Bing wallpaper on a Mac and Bing wallpaper on a Mac 2.0

The script was rewritten in Python and now scrapes the image URL directly from the Bing website instead of relying on third-party aggregators.

The code is fairly straightforward. I am leveraging BeautifulSoup4 for the scraping, and if I weren’t lazily catching all exceptions in a single except block, the funcy module would retry failed attempts to scrape the website or download the file. I’m leaving funcy in the script, but it likely won’t help if the requests calls fail.
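The interplay is easy to demonstrate with a minimal stand-in for funcy.retry (a sketch, not funcy’s actual implementation): the decorator only re-runs the function when an exception propagates out of it, so a catch-all except inside the function defeats the retries.

```python
import functools

def retry(tries):
    """Minimal stand-in for funcy.retry: re-run the function if it raises."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(tries):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == tries - 1:
                        raise  # out of retries
        return wrapper
    return deco

calls = []

@retry(3)
def swallowed():
    calls.append('swallowed')
    try:
        raise RuntimeError("scrape failed")
    except Exception:
        pass  # the decorator never sees the error, so it never retries

@retry(3)
def propagated():
    calls.append('propagated')
    raise RuntimeError("scrape failed")

swallowed()        # runs once; the exception dies inside the function
try:
    propagated()   # runs three times, then re-raises
except RuntimeError:
    pass
```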

I continue to have the script scheduled to run every five minutes via cron, and macOS is configured to change the wallpaper every fifteen minutes.
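For reference, that schedule might look like this in a crontab (the interpreter and script paths here are hypothetical); the fifteen-minute wallpaper rotation itself is configured in macOS System Preferences, not in cron:

```
*/5 * * * * /usr/bin/env python3 /Users/username/bin/bing_wallpaper.py /Users/username/Pictures/Bing
```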

import argparse
import datetime
import hashlib
import logging
import os
import re
import shutil

import bs4
import funcy
import requests

logging.basicConfig(level=logging.INFO,
                    format='[%(asctime)s] - %(levelname)s - %(message)s')
log = logging.getLogger()

@funcy.retry(3, timeout=lambda a: 2 ** a)
def main(dest: str):
    """ Find the URL of today's image and download it if we don't have it.
        Destination filename will be YYYY-mm-dd_{md5sum}.jpg
        @param: dest Destination for downloaded image
    """
    bing_url = 'https://www.bing.com'
    archive_dir = os.path.join(dest, 'Archive')

    try:
        log.info(f"Connecting to {bing_url}")
        r = requests.get(bing_url)
        if not r.ok:
            raise RuntimeError(f"{r.reason}")
    except Exception:
        log.error(f"Could not get data from {bing_url}. Exiting.")
        return

    img_cont = bs4.BeautifulSoup(
        r.content, 'html.parser').find_all('div', class_='img_cont')
    if not img_cont:
        log.error(f"Could not parse html from {bing_url}. Exiting.")
        return
    url = bing_url + re.search(r'\((.+)\)', str(img_cont)).group(1)
    log.info(f"Found image url in html: {url}")
    md5sum = hashlib.md5(url.encode('utf-8')).hexdigest()
    log.info(f"Hash of image url: {md5sum}")

    # Stop if we have this checksum in dest
    existing_files = os.listdir(dest)
    log.debug(f"Existing files in {dest} are {existing_files}")
    if any(md5sum in f for f in existing_files):
        log.info(f"Found {md5sum} hash in {dest}. Exiting.")
        return

    # Build the filename
    image_file = f"{datetime.date.today()}_{md5sum}.jpg"
    image_fullname = os.path.join(dest, image_file)

    # Download the file
    try:
        log.info(f"Downloading {url} to {image_fullname}")
        r = requests.get(url, allow_redirects=True)
        if r.ok:
            log.debug(f"Writing to disk as {image_fullname}")
            with open(image_fullname, 'wb') as f:
                f.write(r.content)
        else:
            log.error(f"Could not download {url}, reason: {r.reason}")
            return
    except Exception:
        log.error(f"Could not download {url} to {image_fullname}")
        return

    # Archive the existing jpg files if archive directory exists
    if os.path.isdir(archive_dir):
        for f in existing_files:
            if f.endswith('.jpg'):
                log.info(f"Archiving {f} to {archive_dir}")
                shutil.move(os.path.join(dest, f), archive_dir)

    # Done
    log.info('Done')

if __name__ == '__main__':
    """ Initialize a (very basic) argument parser with destination directory
        and download the image, archiving any existing .jpgs to {dest}/Archive
    """
    parser = argparse.ArgumentParser()
    parser.add_argument('dest',
                        help='destination directory')
    args = parser.parse_args()
    if not os.path.isdir(args.dest):
        log.error(f"{args.dest} is not a directory. Exiting.")
    else:
        main(args.dest)

Available on GitHub here.

Bing wallpaper on a Mac 2.0

In my previous post, I shared my script to download Bing wallpapers for use on a Mac: Bing wallpaper on a Mac

Something seems to have gone wrong with the iStartedSomething feed on 2016-12-24 [the feed seems to have been fixed on 2016-12-27], so I decided to use a different source. Since I’ve been working with Python more often, I chose it as the language for the new script. I prefer the Anaconda stack for Python development, which includes all of the modules I import out of the box.


import os
import stat
import hashlib
import sys
import datetime
import requests
import bs4

# define destination root directory
root_dir = '/Users/username/Pictures/Bing'
arch_dir = os.path.join(root_dir, 'Archive')
today = str(datetime.date.today())

# create our archive directory
if not os.path.exists(arch_dir):
    os.mkdir(arch_dir, stat.S_IRWXU|stat.S_IRGRP|stat.S_IXGRP|stat.S_IROTH|stat.S_IXOTH)

# find today's image url
feed_url = ''
feed = bs4.BeautifulSoup(requests.get(feed_url).content, 'lxml')
for item in feed.find_all('item'):
    if 'Worldwide, %s' % today in str(item.title):
        image_url = item.find('content:encoded').a.get('href')

# get files in root_dir
root_dir_files = os.listdir(root_dir)

# calculate md5 signature and confirm we don't already have this file downloaded
url_sig = hashlib.md5(image_url.encode('utf-8')).hexdigest()
if any(url_sig in f for f in root_dir_files):
    sys.exit(0)

# archive existing .jpg files
for f in root_dir_files:
    if os.path.isfile(os.path.join(root_dir, f)) and os.path.splitext(f)[1] == '.jpg':
        os.rename(os.path.join(root_dir, f), os.path.join(arch_dir, f))

# download the file
image_file = os.path.join(root_dir, '%s_%s.jpg' % (today, url_sig))
r = requests.get(image_url, stream=True)
if r.status_code == 200:
    with open(image_file, 'wb') as f:
        for chunk in r:
            f.write(chunk)

# The next time macOS changes the wallpaper it should find the new file and update the wallpaper,
# assuming macOS is configured to rotate the wallpaper from /Users/username/Pictures/Bing at some interval.
# I run this script every 5 minutes and configure macOS to update the wallpaper every 15.

Published to GitHub here.

Hope this helps.

Bing wallpaper on a Mac

While this version of the script should still work, I’ve uploaded a new version, written in Python and using a different image source, here: Bing wallpaper on a Mac 2.0

I use Bing Desktop on my Windows machine at the office exclusively for the wallpaper images. At home I have a Mac and I wanted an easy way to get the same wallpapers on my Mac. After some online searching I couldn’t find a solution that I was pleased with, so I wrote my own (below).

#!/usr/bin/env bash

# create our directory tree
mkdir -p ~/Pictures/Bing/Archive

# change to our destination directory
cd ~/Pictures/Bing

# today's image
URL=$(curl -s | grep url | sed -e "s/.*url=\"\([^\"]*\).*/\1/" | head -1)
URLSIG=$(/sbin/md5 -q -s ${URL})
TODAY="$(date +%Y-%m-%d)_${URLSIG}.jpg"

# exit if we already have today's image
if [[ $(find . -type f -name "*${URLSIG}*") ]]; then
    exit 0
fi

# archive old images
mv ./*.jpg Archive/

# download latest image
curl -s "${URL}" -o "${TODAY}"

# the next time OS X changes the wallpaper it should find the new file and update the wallpaper
# assumes OS X is configured to rotate the wallpaper from ~/Pictures/Bing at some interval
# i run this script on the 59th minute of every hour and configured OS X to change the wallpaper every hour

The script only downloads the latest image in the Bing Images feed if a file isn’t found with the same URL signature (MD5 hash) in the destination directory tree. If the script decides it needs to download a new file, it first archives the current file and then downloads the new file. The archive step can be commented out if you would like to keep prior wallpapers in the rotation. My preference is to always use the latest wallpaper.
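The dedup logic amounts to hashing the URL string and checking whether any existing filename embeds that hash; a small Python sketch of the idea (both URLs are made up):

```python
import hashlib

url = "https://example.com/todays-image.jpg"           # hypothetical new image URL
urlsig = hashlib.md5(url.encode("utf-8")).hexdigest()  # 32-char hex signature

# downloaded filenames embed the signature, e.g. 2016-01-01_<md5>.jpg
existing = ["2016-01-01_" + hashlib.md5(b"yesterdays-url").hexdigest() + ".jpg"]
already_have = any(urlsig in f for f in existing)      # a new URL hashes differently
```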

To run the script, save it anywhere, like your home directory, and make it executable:

chmod +x

Then you can set up a crontab or run it manually. I set up a crontab entry to run the script on the 59th minute of each hour:

59 * * * * ~/

Finally, I configured OS X to change the wallpaper every hour from the destination directory.

I published this script to GitHub here.

Pause, not Suspend virtual machines in ESX

I recently had to perform a core storage upgrade affecting the NFS backed datastore of some ESX servers I manage. We wanted to avoid shutting down some virtual machines and found this gem online.

This article illustrates the [unsupported] ability to pause a VM by sending the VM process a kill -STOP signal and resume by sending a kill -CONT signal. We needed to do this because we didn’t want the VM doing any IO while the storage was unavailable to the ESX server.

The article was written for ESXi and I have an ESX environment, so I didn’t use their approach; instead, my one-liners to pause and resume *all* the VMs on the host looked like this:

for vm in `ps -ef | grep vmware-vmx | grep -v grep | awk '{ print $2; }'`; do kill -STOP $vm; done

for vm in `ps -ef | grep vmware-vmx | grep -v grep | awk '{ print $2; }'`; do kill -CONT $vm; done
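To see the STOP/CONT mechanics without an ESX host handy, here is a small Python sketch; a sleep process stands in for vmware-vmx, and the /proc state check is Linux-only:

```python
import os
import signal
import subprocess
import time

# any long-running child works as a stand-in for a vmware-vmx process
p = subprocess.Popen(["sleep", "30"])

os.kill(p.pid, signal.SIGSTOP)            # pause: no CPU, no IO
time.sleep(0.1)
with open(f"/proc/{p.pid}/stat") as f:    # third field is the state; 'T' = stopped
    state = f.read().split()[2]

os.kill(p.pid, signal.SIGCONT)            # resume right where it left off
p.terminate()
p.wait()
```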

Useful, even if the use case is very esoteric.

Migrating Windows 7 to a machine with a different storage controller

I recently had to migrate a Windows 7 installation from a Precision T3500 to the newer T3600. Normally this would not be a problem; however, at the time of the migration I did not realize that the storage controller on the T3600 is different from its predecessor’s and that Windows doesn’t have a built-in driver for it. This meant that when the time came to boot the system on the new hardware, Windows could not boot.

To save you time, the easy fix is: prior to migrating the hard drive, install the storage controller driver and drop the other drivers (like Ethernet and USB 3.0) on the C: drive to install after you boot into the new hardware.

If, like me, you have already made the move and it is too late to go back, you will need to install the driver after the fact by following these steps.

1. Find the correct driver for your storage controller and copy it to a flash drive or similar.
2. Boot into Startup Recovery (Windows should do this for you automatically on the second boot attempt, as the first will fail when Windows cannot find the boot partition).
3. Click on “Load Driver” from the System Recovery Options window and load the driver you have on your flash drive. This will now allow the Windows installation to be found.
4. Click on the “Use recovery tools…” radio button and then click “Next” to begin the automatic repair operation, then cancel it immediately: it will take a long time and may not fix anything. After confirming you want to cancel, look for the “View advanced options for system recovery and support” link in the “Windows cannot repair this computer automatically” dialog.
5. In the “Choose a recovery tool” dialog, click the “Open a Command Prompt” link and run diskpart.exe. Enter the command list vol to get the drive letters for your flash drive and your Windows installation, then exit diskpart once you have this information.
6. Find the path to the driver on your flash drive; you only need the directory containing the driver files.
7. Type in dism.exe /image:<windows_drive_letter> /add-driver /driver:<path_to_driver> /recurse to install the drivers.
8. Once the drivers are installed, reboot the system by typing wpeutil reboot.

You should now be able to boot into Windows to complete the migration.

January post

My customary 2013 January post

The host command

One routinely needs to look up the name or IP of a machine or other device on the network. A popular way of finding an IP address from a hostname is the ping command; however, that only works when you need the IP address, not when you need the hostname.

There are a number of tools that leverage DNS, such as nslookup, dig and host. The three do essentially the same thing, though it should be noted that nslookup is mostly deprecated in favor of dig and host. My personal preference is host, due to its terse output.

These tools are all DNS clients, and as such they query the DNS resolvers defined on your network directly; unlike ping, they do not consult the hosts file. Note that you can also specify the resolver you want to use via a command line argument.

With this information, if I want to look up a machine’s hostname when armed only with its IP, and as long as there is a valid reverse DNS zone that host can query, the command looks like this:

$ host
domain name pointer fw.domain.local.

or if I am looking for an IP or any information from a CNAME or A record (say, a hostname)

$ host fw
fw.domain.local has address

$ host intranet
intranet.domain.local is an alias for www1.domain.local.
www1.domain.local has address

Note that in the case of intranet, host resolves the alias (CNAME) and then proceeds to resolve the actual host record too, giving me the IP address as well.
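The same forward and reverse lookups are available programmatically; a quick Python sketch using the socket module (which, unlike host, also consults the hosts file):

```python
import socket

# forward lookup, like `host fw` resolving an A record
addr = socket.gethostbyname("localhost")

# reverse lookup, like `host <ip>` resolving a PTR record
try:
    name, aliases, addrs = socket.gethostbyaddr("127.0.0.1")
except socket.herror:
    name = None  # no reverse mapping configured
```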

This is one of my favorite tools included with the dnsutils Linux package or Cygwin for Windows.

Optimal Gmail experience on the iPhone

Google recently announced that they are discontinuing EAS (Exchange ActiveSync) for new non-Apps accounts (i.e., regular Gmail).

Google’s reasons? Who knows; it’s likely a combination of sticking it to the man and simply wanting to pursue open standards where applicable.

Affected platforms and workarounds (that I know about)

  • Android owners: not affected, Google sync works on Android devices out of the box
  • iOS users: go download the official (and awesome!) Gmail app for mail. For calendar sync, go to your phone’s Settings and add a new Gmail account, enabling Calendar sync when asked (you can also enable mail sync, but with the Gmail app you don’t really have to). Finally, for contact sync, add a new account of type “Other” and choose “Add a CardDAV account”. Use as the server name and your Gmail credentials to finalize the setup. I’m hoping that in a future iOS release Apple will add a Contacts slider like the Calendar sync one in the Gmail account type properties.
  • Windows Phone 8: who uses this? But seriously, I don’t know… yet. If I had to guess, the cleanest solution would be for Microsoft to start supporting CardDAV and CalDAV in addition to IMAP in their mobile mail application. At least then, Windows Phone users would have a “supported” configuration. I’m keeping an eye out for what Redmond decides to do other than Twitter-blast the Google PR execs.

December 2012

The year has come and gone… many things happened this year. Let’s see if I can keep up with myself and post at least one useful nugget a few times a week… or dare I say daily?!

Keeping up with the January posts each year

I’ll try to post more this year, but I just wanted to keep the tradition alive and get in my January post.