GTAP Resource #6826

"Using Python for Parallelization"
by van der Mensbrugghe, Dominique


Abstract
This short note describes one way of taking advantage of the multiple cores on most desktop computers. It describes running one of the processes in the GTAP build procedure called 'FIT'. The input to 'FIT' is a balanced input-output table (IOT), which is adjusted to match a number of exogenous elements, including aggregate domestic absorption and the import and export vectors. 'FIT' is run for each of the countries/regions in the build, but there is no interaction across countries/regions, so the runs can be executed in parallel. The procedure uses a Python script to run 'FIT' either sequentially or in parallel. Most of the code is generic and can thus be easily adapted to other programs that can take advantage of parallelism, for example Monte Carlo simulations. For the tested 'FIT' procedure, it reduces the runtime from 75 minutes to 14 minutes on a relatively new desktop with a 12th-generation Intel Core i9 CPU with 16 physical cores.
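
As a rough illustration of the approach described in the abstract (not the script distributed with the paper), the sketch below launches an external command once per region, either sequentially or across several worker threads; because each task only starts and waits on a separate program, threads suffice and the operating system spreads the external processes over the available cores. The program name, region codes, and command-line arguments are hypothetical placeholders, not the actual GTAP build configuration.

import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical inputs: the real GTAP build defines its own region list
# and command files for the 'FIT' program.
REGIONS = ["usa", "eur", "chn", "row"]
MAX_WORKERS = 8   # number of external processes to run at once

def run_fit(region):
    """Launch the external program for one region and wait for it to finish."""
    cmd = ["fit.exe", "-cmf", f"fit_{region}.cmf"]   # assumed invocation
    result = subprocess.run(cmd, capture_output=True, text=True)
    return region, result.returncode

def report(region, rc):
    status = "ok" if rc == 0 else f"failed (return code {rc})"
    print(f"{region}: {status}")

def run_sequential():
    for region in REGIONS:
        report(*run_fit(region))

def run_parallel():
    # Threads are enough here: the heavy computation happens inside the
    # external processes, so Python's GIL is not a bottleneck.
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        futures = [pool.submit(run_fit, r) for r in REGIONS]
        for fut in as_completed(futures):
            report(*fut.result())

if __name__ == "__main__":
    run_parallel() if "--parallel" in sys.argv else run_sequential()

Run with a "--parallel" argument, the same script switches from the sequential loop to the threaded pool; the worker count can be tuned to the number of physical cores on the machine.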


Resource Details
Category: Working Paper
Status: Not published
By/In: GTAP Working Paper No. 93
Date: 2023
Version: 1
Created: McIntire, H. (4/6/2023)
Updated: Batta, G. (5/8/2023)
DOI: https://doi.org/10.21642/GTAP.WP93
GTAP Keywords
- Software and modeling tools


Attachments


Public Access
- Example Python code (3.6 KB)
- GTAP Working Paper No. 93 (329.6 KB)



