Using Celery Task Queue in a Content Processing Build Cycle

If you process content regularly, you might deal with capturing data from multiple sources, performing some processing, and then outputting to multiple targets. (For example, you might have a search index and a MySQL database that need to be updated simultaneously every week based on new content from a client.) You also might need error tracking and reporting. Wouldn't it be nice if you could build a system that automates all of this? Wouldn't it be nice if you could trigger your entire build cycle with a single command that will, in turn, trigger all of the many intricate parts to work in unison?

I will demonstrate how to use Celery task queues and task chaining to create an efficient build cycle that outputs processed data to multiple targets.

This talk is aimed at Celery beginners, though some familiarity with Python will help. Even if you are not familiar with Python, I hope to spark the interest of anyone involved in content processing and to share an effective approach that can be applied in many ways.

Experience level: 
Intermediate
Session Time Slot(s): 
Time: 
Sep 11, 2015, 11:00am–11:50am
Room: 
177
Allowed Types: 
Session