Videos

by Ben Ubois

Embedded videos from YouTube and Vimeo will now show up in Feedbin.

Why didn’t Feedbin always show videos?

I’m glad you asked. Feedbin sanitizes all feed content for security reasons. Feed content is passed through a few different filters using the excellent html-pipeline by @jch and others. This library does some great stuff like sanitizing markup, rewriting image sources to go through an SSL proxy and turning relative links and image sources into fully qualified URLs.

Because of the sanitization, all iframes, CSS and JavaScript are removed. One side effect of this is that no video content would show up, since most videos on the web are embedded through iframes. The change today whitelists iframes that load content from YouTube or Vimeo.
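The whitelist check boils down to comparing an iframe's `src` host against a short list of allowed video hosts. Here is a minimal stand-alone sketch; the host list and method name are my assumptions, not Feedbin's actual html-pipeline filter:

```ruby
require "uri"

# Hypothetical whitelist; the real filter may cover more hosts/subdomains.
ALLOWED_VIDEO_HOSTS = %w[www.youtube.com youtube.com player.vimeo.com].freeze

# True if an iframe's src attribute points at a whitelisted video host.
def allowed_iframe_src?(src)
  uri = URI.parse(src)
  uri.is_a?(URI::HTTP) && ALLOWED_VIDEO_HOSTS.include?(uri.host)
rescue URI::InvalidURIError
  false
end
```

Iframes whose `src` fails this check would be stripped along with everything else the sanitizer removes.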

If there are any other video hosts you’d like to see whitelisted please let us know.

Feedbin Updates

by Ben Ubois

Reeder Update

Reeder 3.2 is out. This update brings support for the faster and more accurate version 2 of the Feedbin API. Between the API change and what I can only assume is some sort of magic added by Silvio, sync is fast. Very fast.

I encourage everyone to update because:

API V1 Deprecation

With the release of Reeder 3.2, all major clients are using version 2 of the API. I’d like to deprecate version 1 as soon as possible because it uses a ton of database resources. The index for the records that track read/unread information for V1 of the API has grown to about 30GB, most of it sitting in RAM at all times. Getting rid of this will be a huge relief for the database server.

If you are still using V1 of the API for anything please let me know. Version 2 is very similar except for the way it tracks read/unread information so it should be easy enough to switch over. Check out the documentation for more details.

Pricing

I announced this on Twitter yesterday, but the price for new customers has gone up to $3/month or $30/year. The plan for the extra money is to be able to invest more in Feedbin. The types of things I’d like to spend it on are paying for top-of-the-line hardware, building expensive features like search and, if I’m really lucky, hiring some help. It’s hard to say whether it will pay off because I definitely expect fewer people to sign up.

The rates for all existing customers will remain the same.

I was surprised and touched when existing customers started saying they would pay more, given the option, so this is now possible in your billing settings.

The Future

Now that the server move is out of the way and v1 of the API is on the way out, some real work can begin. First up is a long list of issues and feature requests that have been ignored for far too long. After that Todd and I have a bit of a redesign in the works. Finally there is one surprise that could have a big impact on Feedbin, but I’m not quite ready to talk about that yet :)

Full Metal Hosting

by Ben Ubois

Feedbin is now running in its new home at SoftLayer WDC01.

I wanted to talk a bit about the architecture and hardware that’s being used.

The Data Center

The data center is located in Washington D.C. This was chosen because it seemed to have the best latency of all the options between the US and Europe. Most of Feedbin’s customers are in the US, but Europe is close behind, specifically the U.K. and Germany.

The Bare Metal

Feedbin is heavily I/O bound, and cloud servers just were not cutting it for much of its functionality, so I selected physical machines for the primary servers.

Database Server

Postgres 9.2.4 on Ubuntu 12.04.2 LTS 64bit (with a 3.9 kernel upgrade)

Big thanks to Josh Drake, Andrew Nierman and the rest of the team at Command Prompt for setting this up. These guys do excellent (and fast) work. I wanted a top-notch, reliable and fast Postgres setup and they nailed it.

  • Motherboard: SuperMicro X9DRI-LN4F+ Intel Xeon DualProc
  • CPU: 2 x Intel Xeon Sandy Bridge E5-2620 Hex Core 2GHz
  • RAM: 64GB
  • Storage: 4 x Intel S3700 Series SSDSC2BA800G301 800GB in RAID 10 for about 1.6TB of usable space
  • Alternate storage: 2 x Seagate Cheetah ST3600057SS (600GB) used for WAL.

The database server is in a 6U enclosure. Check out pictures of these things, they’re monsters.

The database has a similarly configured hot standby (using Command Prompt’s PITRtools) and ships write-ahead log files to S3 constantly as well as taking a nightly base backup using Heroku’s wal-e.

Web Servers x 3

Nginx in front of Unicorn running Rails 4.0, Ruby 2.0 (using rbenv) on Ubuntu 12.04.2 LTS 64bit

  • Motherboard: SuperMicro X9SCI-LN4F Intel Xeon SingleProc
  • CPU: Intel Xeon Ivy Bridge E3-1270 V2 Quadcore 3.5GHz
  • RAM: 8GB

Background Workers x 2

Sidekiq Pro, Ruby 2.0 (using rbenv) on Ubuntu 12.04.2 LTS 64bit

  • Motherboard: SuperMicro X8SIE-LN4F Intel Xeon SingleProc
  • CPU: Intel Xeon Lynnfield 3470 Quadcore 2.93GHz
  • RAM: 8GB

The Cloud

At SoftLayer there are a handful of cloud servers as well. Two load balancers running nginx handle SSL termination and load balancing. Both of these get used because DNS is hosted at Route 53, which offers Active-Active DNS Failover.

There is also a 4 GB memcached instance here.

DigitalOcean

All feed refreshing is done in another data center, at DigitalOcean. They provide low-cost AND high-performance virtual private servers, and it’s fast to turn them on or off. The refresh job gets scheduled on a 16GB instance running Redis 2.6.14. Ten 2GB, two-core instances running Sidekiq Pro pick jobs off the queue. The job is simple but needs to run as quickly and with as much parallelism as possible. It makes an HTTP request with If-Modified-Since and If-None-Match headers to take advantage of HTTP caching. Then the feed is parsed using feedzirra. A unique id is generated for every entry in the feed, and the existence of this id is checked against the Redis database.
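The conditional-GET part can be sketched with Ruby’s standard library. Method names and the split into two helpers are illustrative, not Feedbin’s actual code; the stored `etag` and `last_modified` values would come from the previous successful fetch:

```ruby
require "net/http"
require "uri"

# Build a GET request carrying HTTP caching validators, if we have them.
def conditional_request(uri, etag: nil, last_modified: nil)
  request = Net::HTTP::Get.new(uri)
  request["If-None-Match"] = etag if etag
  request["If-Modified-Since"] = last_modified if last_modified
  request
end

# Fetch a feed. A 304 Not Modified response means the feed is unchanged,
# so parsing can be skipped entirely.
def fetch_feed(url, etag: nil, last_modified: nil)
  uri = URI.parse(url)
  request = conditional_request(uri, etag: etag, last_modified: last_modified)
  response = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request(request)
  end
  response.is_a?(Net::HTTPNotModified) ? nil : response.body
end
```

When most feeds answer 304, each refresh cycle transfers and parses only the feeds that actually changed.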

The data structure for the ids in redis was inspired by this Instagram blog post about efficiently storing millions of key value pairs.

A unique id for an entry is a SHA1 of a few different attributes of the entry. After an entry is imported, a key is added to Redis like:

HSET "entry:public_ids:e349a" "e349a4ec0bd033d81724cf9113f09b94267fe984" "1"

Using the first 5 characters of the id as the hash key creates a nice distribution of keys per hash. For 20,000,000 ids only about 1,000,000 Redis keys are created. This means it gets to take advantage of Redis’s hash memory efficiency. I tried this with a few other data structures and found this to be the most efficient. For example, storing the keys as a Redis set took more than twice the memory.
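The bucketing scheme can be sketched in a few lines of Ruby. The attributes hashed here and an in-memory Hash standing in for a Redis client are both assumptions for illustration:

```ruby
require "digest"

# A unique id per entry: a SHA1 over a few entry attributes
# (which attributes exactly is an assumption here).
def entry_public_id(feed_url, guid, published)
  Digest::SHA1.hexdigest("#{feed_url}#{guid}#{published}")
end

# Bucket ids into Redis hashes keyed by the first 5 hex characters.
# 16^5 = 1,048,576 possible buckets, which is why ~20M ids collapse
# into only ~1M Redis keys, each a small, memory-efficient hash.
def bucket_key(public_id)
  "entry:public_ids:#{public_id[0, 5]}"
end

# Simulate HSET / HEXISTS with an in-memory Hash in place of a client.
store = Hash.new { |h, k| h[k] = {} }
id = entry_public_id("http://blog.feedbin.com/atom.xml", "guid-1", "2013-05-01")
store[bucket_key(id)][id] = "1"   # HSET <bucket> <id> "1"
store[bucket_key(id)].key?(id)    # HEXISTS <bucket> <id>
```

With a real client the last two lines would be `redis.hset(bucket_key(id), id, "1")` and `redis.hexists(bucket_key(id), id)`.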

If an entry is determined to be new, the full entry is converted to JSON and inserted back into Redis as a Sidekiq job, where it gets imported by a background worker running at SoftLayer.

I’d love to hear from you if you have any questions or suggestions about the architecture.

Feedbin Server Move

by Ben Ubois

Tomorrow Feedbin will be moving to a new home at SoftLayer. Feedbin will be down from 9:00AM PDT - 3:00PM PDT.

Over the past week the traffic has been going up and performance has been going down. It’s been crazy trying to keep up over the last 3.5 months and this move will bring some much needed power and flexibility.

The new setup is going to be great. I’ll have more details about how I’ve been spending your money in another post, but there will be a total of 21 servers powering the new architecture (6 bare metal, 15 cloud) with some really high performance hardware.

The new setup has been running beautifully as a staging server for the past few days. The downtime tomorrow is to do the final database transfer from the current host to the new host. Running both systems at once could cause some issues with data loss so I’m taking the safe route by turning the site off completely.

Feedbin is built on paying customers and I really appreciate your support and patience. Thanks to you Feedbin has been profitable since three weeks in and will be able to stick around for a while.

Please contact me by email or on Twitter if you have any questions.

Starred Import

by Ben Ubois

You can now import your starred items from Google Reader.

First export your Google Reader data. Then visit the Feedbin Import/Export page and upload the starred.json file from your Google Reader export.

Importing is done in the background so there will be some delay before everything shows up. File size is limited to 75MB, so if you have a larger file please get in touch.

Rename Feeds

by Ben Ubois

Update: There is a new way to rename feeds.

You can now rename feeds in Feedbin.

This lays the groundwork for some feed management features to come. For example, you can also bulk unsubscribe using this interface.

Subscribe

by Ben Ubois

Even before I started working on Feedbin, I was always looking for easy ways to subscribe to feeds. This was something I was so obsessed with that I wrote an iPhone app to manage your Google Reader subscriptions as well as a script to subscribe using Hubot.

My favorite idea I had for subscribing was through email. Email is ubiquitous so it’s always easy to send a link in an email message.

I’m pleased to announce that you can now subscribe to a website’s RSS feed using email. If you check out your Settings page, you’ll see a “Subscribe via Email” section. In this section you’ll find a private subscribe email address.

To subscribe via email, send an email to this address with a link to the website or feed. If Feedbin finds a feed, you will be subscribed to it. Add this email address to your address book to make it easy on yourself.

Feedbin looks for tags like:

<link rel="alternate" type="application/atom+xml" title="Feedbin Blog" href="http://blog.feedbin.com/atom.xml">

So if a website does not include this tag, Feedbin won’t be able to subscribe to it. For websites that offer multiple feeds, Feedbin will choose the first one, as this is usually the main feed for the website.
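Feed autodiscovery of this kind can be sketched with regular expressions, though a real implementation would use a proper HTML parser such as Nokogiri (the method name here is an assumption):

```ruby
# Find the first alternate Atom/RSS <link> tag in a page and return its href.
# Rough regex-based sketch; attribute order in the tag does not matter.
def discover_feed_url(html)
  candidates = html.scan(/<link\b[^>]*>/i).select do |tag|
    tag =~ /rel=["']alternate["']/i &&
      tag =~ %r{type=["']application/(?:atom|rss)\+xml["']}i
  end
  first = candidates.first
  first && first[/href=["']([^"']+)["']/, 1]
end
```

Returning the first match mirrors the behavior described above for sites that offer multiple feeds.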

The other new way to subscribe is through a URL like:

https://feedbin.com/?subscribe=http://blog.feedbin.com/atom.xml

This will let you subscribe through a bookmarklet or through Superfeedr’s cool subscribe button, SubToMe.

Download Reeder with Feedbin Support

by Ben Ubois

Reeder 3.1 with Feedbin support has just been approved and is available now in the App Store.

In addition to Feedbin support, Reeder has some other great new features like:

  • Local/standalone RSS Support
  • Pull-to-refresh
  • Bug fixes

I’ve been using the beta since it was available and it has been an RSS lover’s dream.

Sharing

by Ben Ubois

Feedbin now has built in sharing functionality.

In order to support every sharing service there is, Feedbin does not include support for any service in particular.

Instead, Feedbin gives you the ability to define your own sharing services. This way you can use your favorite sharing services and not see the ones you don’t use.

Most social sites support sharing through a specially crafted URL. For example, to share something on App.net, you would visit a URL like: https://alpha.app.net/intent/post?text=${title}+${url}

Feedbin gives you the tokens:

  • ${title} The entry title
  • ${url} The entry URL
  • ${source} The feed name
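Expanding such a template is a token substitution plus URL-encoding. A minimal sketch, where the method name and entry hash are assumptions (the `${title}`, `${url}` and `${source}` tokens are Feedbin’s):

```ruby
require "erb"

# Replace ${token} placeholders in a sharing-service URL template with
# URL-encoded values from the entry. Unknown tokens become empty strings.
def sharing_url(template, entry)
  template.gsub(/\$\{(\w+)\}/) do
    ERB::Util.url_encode(entry.fetch(Regexp.last_match(1), ""))
  end
end

entry = {
  "title"  => "Sharing",
  "url"    => "http://blog.feedbin.com/sharing",
  "source" => "Feedbin Blog",
}
sharing_url("https://alpha.app.net/intent/post?text=${title}+${url}", entry)
# => "https://alpha.app.net/intent/post?text=Sharing+http%3A%2F%2Fblog.feedbin.com%2Fsharing"
```

URL-encoding the values keeps characters like `:` and `/` in the entry URL from breaking the query string of the sharing link.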

I’ve started a list of sharing service URLs on GitHub. Please feel free to add yours.

Three Quick Features

by Ben Ubois

  1. Entry dates are now converted to your timezone
  2. Column widths are now remembered
  3. Optional image pre-loading (settings)