Blog entries tagged with "cloud"

Managing passwords in the cloud

Saturday, December 24th, 2022 at 12:01pm

For a long time (until a few years ago) my technique to avoid repeating passwords was to use a bookmarklet called SuperGenPass, which would generate a password for me based on the website’s domain and a master password. For important things like email and banking I had specific unique passwords, but the numerous other accounts used this method.
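As a rough illustration of the general idea, here is a minimal Python sketch; it is not SuperGenPass’s actual algorithm, and the hashing choices are my own:

    import base64
    import hashlib

    def derive_password(master: str, domain: str, length: int = 12) -> str:
        # Illustrative only: deterministically derive a site password from a
        # master password and the site's domain. SuperGenPass's real algorithm
        # differs, but the key property is the same - identical inputs always
        # produce the identical password.
        digest = hashlib.sha256(f"{master}:{domain}".encode()).digest()
        return base64.urlsafe_b64encode(digest).decode()[:length]

    print(derive_password("my master password", "example.com"))

The determinism is the whole point, and as it turned out it is also the flaw I describe further down: if a site forces a password reset, the only input you can change is the master password.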

I also avoided using the same email address for new accounts by never using the “login with Google/Facebook” option and, as much as possible, creating a new unique email alias for each one. I don’t actually see much spam, but when I do it is always interesting to note which address was being used. Did company A have a breach or did they just sell their customer data?

Back to the passwords… it was nice to have a quick way of ensuring that my passwords were different, but over time I noticed a flaw in this process, one to do with data breaches and forced password resets.

If the account wasn’t important, such as something that had been needed for a specific purpose but no longer was, then one option was to delete the account and move on. However, if I needed to keep using the account then the password needed to change, but because it was generated from my master password and the domain of the site, it couldn’t. For a short time I had two master passwords: one for most accounts and a second for accounts that I remembered had needed to be changed. This wasn’t working, so I switched over to a password manager.

I didn’t want to have to manage a password file myself so based on recommendations I had a look at both 1Password and LastPass, deciding on LastPass as it felt easier to use. There were plugins for both Firefox and Chrome, as well as an Android app.

This was working out well for a couple of years, until LastPass announced that they would essentially start taking features away from the free version. The main change I remember affecting me was that free accounts would be locked into either desktop or mobile access. Not that big a deal, as I rarely used the Android app, and desktop still meant that I could use it across multiple browsers and computers.

I also started to notice the interface changing, and not for the better:

  • Something I really liked about the LastPass plugin was that I could click the toolbar icon, type part of a website name and press enter. It would then load the site and automatically log in for me. This was very convenient until it became glitchy, by which I mean that sometimes it didn’t take the keyboard input. I would have typed the name and pressed enter, but nothing happened, so I would have to click the icon again and make sure the cursor was within the search field.
  • Not that long ago the plugin prompted me to save credit card details. I decided to give it a go, then removed my card details because it was just broken. I couldn’t see how LastPass would be able to populate the card details when the forms on different sites are so varied: is the expiry date one field or two, is the year two digits or four, is the month a number or a name, is it a text input or a drop down? After having it enabled while I made a couple of online purchases, it insisted on four different entries for the same card. It also wanted the CVV, so nope.
  • About a week ago LastPass started prompting me to save the password for my email and my bank, which are the accounts that I never put into LastPass. I double checked that they are still listed under “Never URLs” in my account settings, yet the plugin is still prompting me.
  • Another odd thing I discovered last year, while listing a few items for sale on eBay, was how the plugin interacts with websites. As I was listing items I kept getting an error saying my description contained JavaScript. I was hand typing the simple HTML, but it turned out that the LastPass plugin was fiddling with the form input, a problem that had been known about for a while. Any plugin of this nature does need to scan the page for login forms and possibly modify those, but it doesn’t make sense to insert JavaScript into an eBay listing description.

So… all of this has meant that I have become less happy with LastPass over time, and that isn’t even touching on the security problems that I had been kind of ignoring. I didn’t know that, despite their marketing claiming zero knowledge of the data in my vault, URLs and other data are not encrypted.

So I need to switch, but switch to what?

For convenience I want a cloud based solution, and 1Password does appear to be the recommended alternative (these days I need to be prepared to pay for important things, not just go for free but limited options), though Bitwarden has also been suggested. Looks like I need to do some more reading…


Three months with Android

Sunday, October 17th, 2010 at 10:07pm

Three months ago I switched away from an iPhone to an Android based device. About a week after it arrived by courier, the iPhone 4 was released in Australia and I began writing a post about my first week and a bit with the Samsung Galaxy S (SGS).

I didn’t get very far as the week turned into a month, then into two and now into three. Part of my excuse is that the official release of Froyo (Android 2.2) for the phone was imminent. That is still to happen, but of course it will be released for Australia tomorrow now that I have written this post.

So what is the phone like?

To be honest it is nothing special, because it does exactly what I expect from a device of its kind. It does phone calls and text messages, and most importantly it has a web browser. Of course it has other features, which have been occasionally useful: a camera, video playback, photo viewing, music playback and downloadable apps. But these are all expected.

One thing that did surprise me was the Gmail client. At first I still had my email hosted at home so I tried out a few IMAP clients. None of them worked as well as I wanted them to.

However, once I moved my mail up into Gmail I was able to use the Gmail app, which, unlike the IMAP apps I tried, just worked. Actually, no. It didn’t just work, it worked extremely well. It works so well that I am happy to state that anyone reading email on an Android device that isn’t Gmail is doing it wrong.

I would also take that statement the other way, that anyone using IMAP to access Gmail is also doing it wrong. The Gmail model of messages is different enough that IMAP has to make compromises to work at all. I assume there are some people that can make Gmail over IMAP work reliably, but the majority of people don’t and that just seems to cause problems.

But back to the phone…

During the three months I have played around with other aspects of the phone (including SL4A, the Scripting Layer for Android, which allowed me to control the phone using perl) but none have progressed from playing to regular use.

What I have realised from this is that I am too lazy to care about shiny things; I need them to be practical. The SGS has proved to be just that: practical.


Gmail filters and rss2email

Thursday, August 12th, 2010 at 11:33pm

After seeing that external mail was being delivered correctly to my Google Apps domain, I changed my rss2email config to deliver to Google Apps instead of directly to the local mailbox.

When filtering with procmail you can filter on anything you want, but in most cases a few regexes will get you what you need. In contrast the filtering options in Gmail are extremely limited. There are a couple of headers you can match against, and then just a simple string match.

I was able to tweak rss2email to add the URL of the RSS feed as the List-Id header, and I can then set up a filter for each RSS feed. Again, like forwarding addresses, I monitor a lot of RSS feeds. Over 120, which means I would also need over 120 filters just for RSS feeds.
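In case anyone is curious, the tweak boils down to stamping the feed’s URL into a List-Id header on each outgoing message. Something like this Python sketch shows the idea; the helper and addresses are made up, not rss2email’s actual code:

    from email.message import EmailMessage

    def build_feed_message(feed_url: str, subject: str, body: str) -> EmailMessage:
        # Hypothetical helper - rss2email constructs its messages differently,
        # but the key step is the same: put the feed URL into List-Id so that
        # Gmail (or procmail) has something stable to filter on.
        msg = EmailMessage()
        msg["From"] = "rss2email@example.com"
        msg["To"] = "me@example.com"
        msg["Subject"] = subject
        msg["List-Id"] = f"<{feed_url}>"
        msg.set_content(body)
        return msg

Each Gmail filter then only has to do a simple string match against that header value.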

What would work better is for the RSS feeds to be categorised before the emails are sent out; I would then only need a filter per category. An additional benefit is that when another RSS feed is added, a new filter is not required.

As I am considering further modifying rss2email (or replacing it completely), what else could I do?

Something that I don’t like about Gmail (and certain other mail clients) is that the message lists display the time the message was received by Google, not the time it was sent or the time in the message headers. This means that the RSS messages are clumped together because the script only runs once every few hours.

This cannot be changed as long as the messages are delivered via SMTP. But, thanks to a small project at work, I know that if I were to write the messages in directly via IMAP, the dates would be what I want.

If I were writing the messages in via IMAP, filters would not be run, but writing the messages directly to the appropriate label means that the filters are not even needed. I have no idea how to modify rss2email to use IMAP, so I would be writing my own solution from scratch.
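If I do go down that path, the core of it would look something like the sketch below using Python’s imaplib. APPEND lets the client supply the internal date, so the message shows up with the feed item’s own timestamp instead of the delivery time (the account details and label are placeholders, not my real setup):

    import imaplib
    import time
    from email.message import EmailMessage

    def append_with_date(msg: EmailMessage, posted: time.struct_time) -> None:
        # Placeholder account details and label.
        imap = imaplib.IMAP4_SSL("imap.gmail.com")
        imap.login("me@example.com", "app-password")
        # APPEND straight into the label (which must already exist); imaplib
        # formats the supplied struct_time via Time2Internaldate, and that
        # becomes the message's internal date.
        imap.append("RSS/News", None, posted, msg.as_bytes())
        imap.logout()

The posted value would come from the feed item itself, for example the published_parsed field that feedparser exposes.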

This method of direct injection via IMAP is also how my identi.ca/Twitter/Facebook updates should be delivered. In this case I intend to write something that uses the appropriate API, not the RSS feed as is the case for identi.ca and Twitter. I already use the API for Facebook, but only to produce an RSS feed that is then picked up by rss2email. It is a bit convoluted, but it has worked.


Email in the cloud

Thursday, August 12th, 2010 at 07:39pm

Yes, I know I should get around to finishing my post about how I am finding Android, but other things are happening that are all interrelated to some degree.

Some time ago I started to set up Google Apps for my domain. Yesterday, after recovering my password, I completed that setup and got to the point where new mail would be delivered to Google. I also have RSS items delivered there as well, but more on that later.

One of the things that made me hesitant about switching to Google Apps was in regard to forwarding/nicknames/aliases. Whenever something asks me for an email address I will create a specific forwarding address. I do not handle this through an email catchall, as separate forwards make it possible to delete an address if it is no longer needed, or starts to get too much spam.

I currently have 150 forwarding addresses. Not too long ago I had over 200, but I did clean some of them out.

Google calls them nicknames and they appear to have a limit of 30 per account. Now I could create a chain of multiple accounts in Google Apps, each with 30 nicknames that forward on to a single account, but that just seems ugly. I am also considering paying for a Premium account (for greater peace of mind and the ability to turn off ads), but that is US$50 per account per year. Maintaining nicknames in Google is not worth US$250+ per year.

So I needed a way to keep the MX records pointing at my hosting where the 150 forwarders all point to a single mailbox. I then need that single mailbox to forward the messages on to Google. I had seen this done using the test-google-a.com temporary user address.

Two problems with that. First, Google says that the address will disappear from the Google Apps domain after a period of time, and second, that had already happened to my domain.

When I talked this over with Rob, he suggested that I set up a subdomain on my hosting and add it as a domain alias within Google Apps. The MX records for this subdomain point at Google, and the single mailbox on my domain forwards to an address on the subdomain. Since Google knows about the domain it will accept mail for it, and the messages end up in my account. It is working fine so far.

I have not attempted to migrate my existing messages yet; first I will work out how to structure the labels and what filters I want to set up.

Now filters, that will be the topic of another post.


Thinking about moving my email into the ‘cloud’

Wednesday, October 14th, 2009 at 10:07pm

Despite what I have said in the past, my mind is gradually changing in regard to moving my email up into Gmail instead of hosting it myself on my home linux box.

What has changed?

Since my previous comments I have played around a bit more with Gmail (in no small part due to work) and I am starting to think that the conversation and label model might work better for me.

Currently I file both messages I receive and messages I send (for some topics) in the same folder. The benefit is that I can then see all messages in the thread in one place, while the downside is that there is no longer a copy in my sent folder. Conversations give me the benefit without that downside.

Another advantage is that messages can be labelled and still be in my inbox. So I can choose to still have some mailing lists appear in my inbox, but once I have read a message I will archive it instead of deleting it. The message will then only appear under the label (and All Mail).

Currently I need to either remember to check the folder for that list, or move each message to the correct folder once I have read it. I have found both approaches to be problematic.

But Gmail is not without its disadvantages.

The primary copy of my mail will now be in the cloud (again shown to have its problems), so I need to make sure that I have an automated backup (offlineimap run from cron should suit) and I will need an Internet connection to access my mail (also needed everywhere except home).

The bulk of my investigations have been into how well the labels and conversations work with IMAP. I have found that if a single message of a conversation has a label, the web interface makes it appear as if the whole conversation has the label, but over IMAP only the message with the label is in the folder. So to get the most out of labels I would need to use the web interface.
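For anyone wanting to poke at this themselves, a quick imaplib session shows it: each label appears as an IMAP folder, and selecting one reports how many messages carry that label directly (the account and label names here are made up):

    import imaplib

    imap = imaplib.IMAP4_SSL("imap.gmail.com")
    imap.login("me@example.com", "password")

    # Every Gmail label is exposed as an IMAP folder.
    typ, folders = imap.list()
    for line in folders:
        print(line.decode())

    # Selecting a label counts the messages that carry it directly; only the
    # individually labelled message shows up here, not its whole conversation.
    typ, data = imap.select("Lists/Example", readonly=True)
    print(data[0].decode(), "messages")

    imap.logout()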

My next concern is how I manage forwards/nicknames/aliases, as I currently have a single mailbox but over 200 forwards. Almost every time something needs an email address I will create a new forward. This makes it very clear who leaked or sold my email address to a spammer. If I were to use Google Apps, it appears that there is a limit of 30 nicknames per mailbox. Not nearly enough. Although it won’t be as convenient, I could continue to manage the forwards with my domain host.

I think that I am at the stage where I will give it a try by redirecting my mail and I won’t move all of my existing mail over until I am satisfied. This should also give me an idea of what mail functionality I actually need on my iPhone.


I don’t trust the cloud

Saturday, August 15th, 2009 at 03:29pm

Since StixCampNewstead I have been meaning to write a post about trusting the cloud. I did start it, but it turned into quite a long and detailed post that I never got around to completing.

It seems that every couple of weeks something happens to compromise user data. A few that I noted were Ma.gnolia losing their database, Bloglines being neglected after being sold, Google dropping services, Kodak changing their terms of service, and one of the many examples of Facebook privacy issues. The one prompting this post is the recent (now reversed) decision to shut down tr.im (a URL shortening service).

I don’t use URL shortening services very often, partly because I haven’t needed to and partly because I don’t agree with them, but this type of action by tr.im has made me decide to set up my own. I’ll probably use one of the WordPress plugins, but Lifehacker has an article with other options.

I have all sorts of data, ranging from private data I need to keep (emails, documents, financial records) to public data that I don’t care about (dents and tweets). In between is data that I care about, both private (family photos) and public (photos for competitions or that I have up on Flickr).

I have two rules:

  • If the data is private I try to store it at home (with appropriate backups) instead of on a remote service.
  • If I care about the data I make sure that it is stored at home, or if stored in the cloud I have a backup.

The first rule is why I still run my own IMAP server instead of shifting it out of the country to Google or similar. The second rule is why I still have all the originals for my photos that are on Flickr and why I have nightly cron jobs to backup this site, my delicious bookmarks, etc.

My data aside, it is interesting to see what others are doing, and not just with their own data but with other people’s. One great example of this is the Archiveteam, which keeps track of services that are going down but also steps in to try to preserve their data, as is happening with Geocities. Archiveteam is run by Jason Scott, creator of BBS: The Documentary. His blog post FUCK THE CLOUD prompted quite a reaction and now, six months later, it is still getting comments.

It isn’t just your own data that you should care about, but also any data that you rely on.
