CRON Gone Wild … with ikura.

Sep 20

An ikura app — part II

As we continue from the first post, right away we are confronted with some busy-work: for our bonegram application to work, we need to create a Twitter app. The details of this can be googled, but from our efforts we will get a Twitter app consumer key, consumer secret, access token, and access token secret; the four credentials required for working with Twitter's API. The Twitter app also needs the 'Read, Write and Access direct messages' permission set.

With those four credentials in hand, we update our 'dev.config' accordingly:

  [{bonegram, [
    {http_port, 8004},
    {key, <<"your-twitter-consumer-key">>},
    {secret, <<"your-twitter-consumer-secret">>},
    {token, <<"your-twitter-token">>},
    {token_secret, <<"your-twitter-token-secret">>}
  ]}].

Now that those details are out of the way, let’s re-acquaint ourselves with some goals.

When ikura calls our bonegram HTTP endpoint, we in turn want to call Twitter and fetch as many new and relevant tweets from injured Twitter users as possible. Twitter will give us each user's handle (e.g. @foobar), their tweet text, and some other useful fields; everything bonegram needs to send them a nice tweet.

To parse Twitter’s API replies, we will use an Erlang library called ‘jsx’ for converting JSON to usable Erlang terms. Along those same lines, we want to utilize an existing oauth library so we don’t have to write code for making proper API calls. We make the following edits to the project’s ‘rebar.config’ file:

{require_otp_vsn, "17"}.

{deps, [
  {jsx, ".*", {git, "git://github.com/nato/jsx.git", 
    {branch, "master"}}},
  {oauth, ".*", {git, "git://github.com/nato/erlang-oauth.git", 
    {branch, "master"}}},
  {cowboy, ".*", {git, "git://github.com/nato/cowboy.git",
    {branch, "stable"}}} ]}.
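For context on why jsx is here: in the pre-2.0 versions, jsx:decode/1 turns a JSON object into an Erlang proplist of binaries, which is exactly the shape our extraction code later in this post relies on (newer jsx versions return maps by default instead). A quick shell sketch:

```erlang
%% jsx decodes a JSON object into a proplist of binary keys and values;
%% nested objects become nested proplists.
1> jsx:decode(<<"{\"user\":{\"screen_name\":\"foobar\"}}">>).
[{<<"user">>,[{<<"screen_name">>,<<"foobar">>}]}]
```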

We must also make the appropriate changes within ‘src/bonegram.app.src’. We change that file to the following:

{application, bonegram, [
    {description, ""},
    {vsn, "1"},
    {registered, []},
    {applications, [
        kernel,
        stdlib,
        cowboy,
        jsx,
        oauth
    ]},
    {mod, {bonegram_app, []}},
    {env, []}
]}.

Running the following will grab the new oauth & JSON libraries and compile them for use:

./rebar get-deps
./rebar compile

Consuming tweets

Now we are ready to introduce some new code. We create a file called ‘src/bonegram_lib.erl’ with the following:


-module(bonegram_lib).

%% api
-export([new/0, new/1]).

-define(KEY,          env(key)).
-define(SECRET,       env(secret)).
-define(TOKEN,        env(token)).
-define(TOKEN_SECRET, env(token_secret)).

%% api routines

new() ->
    Params = basic_params(),
    handle_twitter_call(Params).

new(Id) when is_integer(Id) ->
    Params  = basic_params(),
    Params1 = [{"since_id", Id}|Params],
    handle_twitter_call(Params1).

%% business routines

handle_twitter_call(Params) ->
    {ok, Body}      = handle_oauth_get(Params),
    {ok, [{tweets, T}, 
      {max_id, M}]} = handle_digest(Body),
    {ok, Recs}      = handle_collection(T),
    State           = {M},
    {ok, Recs, State}.

handle_oauth_get(Params) ->
    C = {?KEY, ?SECRET, hmac_sha1},
    U = "https://api.twitter.com/1.1/search/tweets.json",
    {ok, {{_, 200, _}, _, X}} = oauth:get(
      U, Params, C, ?TOKEN, ?TOKEN_SECRET),
    {ok, X}.

handle_collection(L) ->
    Recs = [ [
      {id_str,      extract(<<"id_str">>, X)},
      {created_at,  extract(<<"created_at">>, X)},
      {screen_name, extract(<<"user">>, <<"screen_name">>, X)},
      {text,        extract(<<"text">>, X)}] || X <- L ],
    {ok, Recs}.

handle_digest(Body) ->
    Term   = normalize(Body),
    Tweets = extract(<<"statuses">>, Term),
    MaxId  = extract(<<"search_metadata">>, <<"max_id">>, Term),
    {ok, [{tweets, Tweets}, {max_id, MaxId}]}.

%% support routines

normalize(X) ->
    Bin = list_to_binary(X),
    jsx:decode(Bin).

env(What) ->
    {ok, X} = application:get_env(bonegram, What),
    X.

extract(What, X) ->
    {What, V} = lists:keyfind(What, 1, X),
    V.

extract(WhatA, WhatB, X) ->
    Y = extract(WhatA, X),
    extract(WhatB, Y).

basic_params() -> 
    [{"q", "-almost broke my arm :("}
    ,{"result_type", "recent"}
    ,{"lang", "en"}
    ,{"count", 100}].

The code in ‘src/bonegram_lib.erl’ is a slathering of sequential specifics that:

  1. calls Twitter’s search API with a query.
  2. digests the JSON that Twitter returns.

Our query is “-almost broke my arm :(”. The leading “-” is a Twitter search operator that excludes tweets containing the word “almost”, so we match people who actually broke an arm rather than those who nearly did. This is good enough for now, as it covers the criteria we care about.
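For illustration, once URL-encoded, the parameters from basic_params/0 produce a request roughly along these lines (a sketch only; the actual encoding is handled by the oauth library, which also appends the OAuth signature parameters):

```
GET https://api.twitter.com/1.1/search/tweets.json?q=-almost%20broke%20my%20arm%20%3A%28&result_type=recent&lang=en&count=100
```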

Our new module works simply enough: it gathers broken-bone-related tweets from the last several days, all via bonegram_lib:new/0. The other exported function, new/1, performs nearly the same task, only it takes an id as its argument. We will use new/1 to gather only the tweets posted since the last time bonegram ran. And we accomplished all this in a mere 81 lines of code.
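Put together, a typical session would look roughly like this (a shell sketch; the record list and the {MaxId} state tuple follow the shapes returned by handle_twitter_call/1 above):

```erlang
%% first run: fetch recent matching tweets and remember the max_id
1> {ok, Recs, {MaxId}} = bonegram_lib:new().
%% later runs: only fetch tweets newer than MaxId
2> {ok, NewRecs, {MaxId1}} = bonegram_lib:new(MaxId).
```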

In the next blog-post of the series, we will wire a few things together, play with what we have so far, and dive into the ikura configuration.

(Continued in part III.)
