12/29/2015 – Wait Time

Running an advanced algorithm in SAS takes a long time. I just came back from lunch and the query is still running. Life is a game of patience, and frankly, by the time it finishes running I'd be fossilized. To be fair, my lunch lasted less than half an hour, so it probably isn't a giant query like the one I wrote a few months ago (which took 50 minutes).

Most of the time issues come from joining giant tables. I've tried adjusting the algorithms to make them more efficient: reducing the size of the data sets being joined, packing as many actions into each query as I could so there'd be less computation in the next one, and writing macros so there'd be less repeated code. However, it seems that code efficiency does not mean computation efficiency.
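For instance, here's a minimal sketch of what I mean, with made-up library, table, and column names: filter each side down before the join, and fold the aggregation into the same query.

  proc sql;
     /* shrink the left side before joining, instead of joining the full table */
     create table work.recent_accts as
     select account_id, risk_level
     from warehouse.accounts
     where open_date >= '01JAN2015'd;

     /* do the join and the aggregation in one query */
     create table work.acct_balances as
     select a.account_id,
            a.risk_level,
            sum(t.balance) as total_balance
     from work.recent_accts as a
     inner join warehouse.balances as t
       on a.account_id = t.account_id
     group by a.account_id, a.risk_level;
  quit;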

There may be a few things causing this problem:

  1. Macros are actually slowing things down: since I'm joining huge datasets, the macros run through them again and again. There is no way to avoid this entirely, since I need to obtain the same information for different risk levels from the same tables, but a possible cure is to run the programs against a subset of the original database (see the sketch after this list).
  2. Too many actions in one query: after a bit of Googling I found that this isn't a problem; it's actually beneficial to combine different actions into one query. That's one myth busted.
  3. Crummy coding: this is a little unlikely, since the SQL I'm writing is fairly standard across the board and I'm merely extracting data at this point for exploratory analyses. Still, I could split the work into separate queries and check the log to see how long each one takes.
  4. Conflicting computation requirements: it seems that I'm not the only one skipping vacation in the days between Christmas and New Year's. Plus, a minor issue is currently on the hot seat with another team, so they're requesting a lot of support from the data analysts who work in and around the SAS systems. Again, no way out of this one but patience.
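For #1 and #3, a rough sketch of what I have in mind (data set and variable names are made up): develop against a small sample of the big table, and turn on fullstimer so the log shows how much real and CPU time each step actually takes.

  options fullstimer;                        /* log detailed timing for every step */

  /* develop and debug against a small sample instead of the full table */
  data work.accounts_sample;
     set warehouse.accounts (obs=100000);    /* first 100,000 rows only */
  run;

  %macro risk_pull(level);
     proc sql;
        create table work.risk_&level. as
        select *
        from work.accounts_sample
        where risk_level = "&level.";
     quit;
  %mend risk_pull;

  %risk_pull(high);
  %risk_pull(medium);
  %risk_pull(low);

Once the sample runs clean and the log shows which step is eating the time, the same macro can be pointed back at the full table.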

I can only potentially improve one of the four issues described above (#3), so it wouldn't speed up processing much at all. In the meantime, I might spend some time reviewing the loess method I implemented last week; this regression method has yielded some interesting insights about some of our products. So far I've only been interpreting the results visually, so there could be additional information to uncover once I (re)learn how to interpret the coefficients.
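For reference, the call I have in mind is just a basic PROC LOESS fit; the data set and variable names below are placeholders.

  proc loess data=work.product_history;
     model margin = tenure_months / smooth=0.4;   /* smoothing parameter to tune */
  run;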

 

Update: I was able to cut the run time almost in half by using the compress=yes option and by joining tables on unique accounts to reduce the size of every table created. It turns out that one of the tables had 251 million rows of data; when I tried to join it with another table, the information multiplied and overwhelmed the system. The new program took about 50 minutes in total for all the risk levels I tested, which is a drastic improvement.
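Roughly what the fix looked like, with placeholder names: deduplicate the account key first so the join can't fan out, and write the output with the compress=yes data set option.

  proc sql;
     /* dedupe the key first so the join doesn't multiply rows */
     create table work.unique_accts as
     select distinct account_id
     from warehouse.transactions;             /* the 251-million-row table */

     /* compress=yes shrinks the output data set on disk */
     create table work.acct_risk (compress=yes) as
     select a.account_id,
            b.risk_level
     from work.unique_accts as a
     inner join warehouse.risk_scores as b
       on a.account_id = b.account_id;
  quit;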
