The Web Application for Financial & Economic Data Analysis provides unique, highly customizable, web-based research and analysis tools for analysts, portfolio managers, economists, traders and other financial professionals.
Its intuitive tools enable users to build customized models and charts that compare securities, options, commodities, economic and user-uploaded data with one another. Model highlights include the ability to create custom data series in a Moving Average or MACD format, blended model weights, correlation studies, lead/lag analysis, and performance back-testing and calculations.
More than 30,000 data series, combined with versatile proprietary tools, user-friendly outputs, and chart saving and sharing features, make for an unrivalled product at the price. The system currently offers three product subscriptions: Equities, Futures & FX, and Combined.
The application uses a Flex layer for all calculations based on the user's selection (e.g. EMA, SMA or MACD). The business layer is coded in Java and Spring, with each service defined as an interface and its implementation. The DAO layer is a combination of Spring and Hibernate, using Spring's session factory.
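The production calculations run in the Flex layer (ActionScript), but the moving-average logic the paragraph refers to can be sketched in Java; class and method names here are illustrative, not taken from the actual codebase:

```java
// Illustrative sketch of SMA/EMA calculations; the real implementation
// lives in the Flex (ActionScript) layer of the application.
import java.util.ArrayList;
import java.util.List;

public class MovingAverages {

    // Simple Moving Average over a fixed window.
    static List<Double> sma(double[] prices, int window) {
        List<Double> out = new ArrayList<>();
        double sum = 0;
        for (int i = 0; i < prices.length; i++) {
            sum += prices[i];
            if (i >= window) sum -= prices[i - window]; // drop value leaving the window
            if (i >= window - 1) out.add(sum / window);
        }
        return out;
    }

    // Exponential Moving Average with the usual smoothing factor 2 / (window + 1).
    static List<Double> ema(double[] prices, int window) {
        List<Double> out = new ArrayList<>();
        double alpha = 2.0 / (window + 1);
        double prev = prices[0];
        out.add(prev);
        for (int i = 1; i < prices.length; i++) {
            prev = alpha * prices[i] + (1 - alpha) * prev;
            out.add(prev);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] closes = {10, 11, 12, 13, 14};
        System.out.println(sma(closes, 3)); // [11.0, 12.0, 13.0]
        System.out.println(ema(closes, 3));
    }
}
```

A MACD series is then just the difference of two such EMAs (typically 12- and 26-period).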
The application uses different data sources, such as Commitments of Traders and IVOL, to keep all of its data synced with current trends in the market. The data is loaded by cron jobs, Quartz schedulers and a download utility configured with the application.
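A data load scheduled through cron might look like the following; the paths, script names and schedules are hypothetical placeholders, not the actual configuration:

```shell
# Hypothetical crontab entries for the data-load jobs; paths and
# schedules are placeholders, the real ones are site-specific.

# Nightly run of the download utility at 01:00, appending to a log.
0 1 * * * /opt/chartingtool/bin/download-utility.sh >> /var/log/ct-download.log 2>&1

# Weekly Commitments of Traders load after the Friday CFTC release.
0 18 * * 5 /opt/chartingtool/bin/load-cot.sh >> /var/log/ct-cot.log 2>&1
```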
| Operating System | Red Hat Linux; multi-server architecture with staging and production environments through version-controlled releases; load balancer; Apache optimization, security and SSL implementation; scheduled backups; alert monitoring system integration; regular server performance tuning; software firewall configuration and maintenance; email server configuration, etc. |
| Development Environment | J2EE, Spring, Hibernate, BlazeDS Framework, Microsoft Visio, Eclipse Indigo, JavaScript, CSS, HTML, SVN, etc. |
| Database | MySQL; DB clustering; DB optimization; high availability; master–slave replication; query optimization, including slow-query optimization; scheduled backups; alert monitoring system integration, etc. |
| Quality Assurance Testers | 1 |
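The master–slave replication mentioned above is configured in MySQL's option files; a minimal sketch, with placeholder server IDs and log names, could look like this:

```ini
# Hypothetical my.cnf fragments for MySQL master-slave replication;
# server IDs, log names and the read_only choice are assumptions.

# --- master ---
[mysqld]
server-id = 1
log-bin   = mysql-bin

# --- slave ---
[mysqld]
server-id = 2
relay-log = relay-bin
read_only = 1
```

The slave is then pointed at the master's binary log with `CHANGE MASTER TO ...` and started with `START SLAVE`.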
The Charting Tool application consists of three parts: the Flex UI, the Java server and the data sources. In addition, data scripts run to load data from various external sources into the system’s database, and an email reader runs to fetch data sent to the system’s email address.
The application is hosted on two cloud server instances with Rackspace.
The heart of the application is the Tomcat app server. The application is split into two web applications, ROOT and ChartingTool. The code for the client’s website is served by the ROOT web app (/usr/share/apache-tomcat-6.0.29/webapps/ROOT), and the services for the Charting Tool are all handled by the ChartingTool web app (/usr/share/apache-tomcat-6.0.29/webapps/ChartingTool). The Charting Tool app uses the Spring and Hibernate frameworks for business-logic wiring and database access.
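The Spring/Hibernate wiring described here is typically declared in an application context file; this is only an illustrative sketch, and the bean names, mapping files and connection details are assumptions:

```xml
<!-- Hypothetical applicationContext.xml fragment showing how a Hibernate
     session factory is wired through Spring; names and URLs are placeholders. -->
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
    <property name="driverClassName" value="com.mysql.jdbc.Driver"/>
    <property name="url" value="jdbc:mysql://localhost:3306/chartingtool"/>
</bean>

<bean id="sessionFactory"
      class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
    <property name="dataSource" ref="dataSource"/>
    <property name="mappingResources">
        <list><value>ChartModel.hbm.xml</value></list>
    </property>
</bean>
```

DAO beans then receive the `sessionFactory` by injection instead of constructing Hibernate sessions themselves.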
The application follows a service-oriented architecture, and the Charting Tool exposes multiple services in the Java services layer. All remote method calls originate from the Flex UI and are processed by the BlazeDS layer (http://opensource.adobe.com/wiki/display/blazeds/BlazeDS), which then utilizes one or more Java services. Calls such as login, post chart and subscribe, which originate from the HTML pages, are processed by the JSP/Servlet layer, which in turn uses the Java services.
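In BlazeDS, each Java service reachable from Flex is declared as a remoting destination; the destination id and source class below are illustrative assumptions, not the application's actual names:

```xml
<!-- Hypothetical remoting-config.xml destination exposing a Java service
     to the Flex UI over AMF; id and source are placeholders. -->
<destination id="chartService">
    <properties>
        <source>com.example.chartingtool.service.ChartService</source>
    </properties>
    <adapter ref="java-object"/>
</destination>
```

The Flex client then calls the service through a `RemoteObject` whose `destination` attribute matches the id.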
The Delegate layer contains the business logic and uses the Data Extractor layer to get hosted data from the Elmwood database server (e.g. iVol, COT), to run specialized data fetchers and parsers for non-hosted data (Internet search trends, FRED, DDF), or to fetch user-uploaded data saved on the file system.
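A non-hosted data parser of the kind described here can be sketched as follows; the class name and the date/value CSV shape (typical of FRED-style downloads) are assumptions, and a real fetcher would also handle the HTTP download, retries and logging:

```java
// Hypothetical sketch of a non-hosted data parser: turns a "DATE,VALUE"
// CSV download into an ordered date -> value map.
import java.util.LinkedHashMap;
import java.util.Map;

public class SeriesParser {

    // Skips the header row and missing observations (marked ".").
    static Map<String, Double> parse(String csv) {
        Map<String, Double> series = new LinkedHashMap<>();
        String[] lines = csv.split("\\R");
        for (int i = 1; i < lines.length; i++) {          // i = 1: skip header
            String[] cols = lines[i].split(",");
            if (cols.length < 2 || cols[1].equals(".")) continue;
            series.put(cols[0], Double.parseDouble(cols[1]));
        }
        return series;
    }

    public static void main(String[] args) {
        String csv = "DATE,VALUE\n2011-01-01,1.5\n2011-02-01,.\n2011-03-01,2.0";
        System.out.println(parse(csv)); // {2011-01-01=1.5, 2011-03-01=2.0}
    }
}
```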
The email reader task is configured in the ChartingTool web app as a cron task that runs every 4 hours using Spring Scheduling. It uses IMAPS to check the email account and extract all .xls files for data sources such as Rail Data, plus other ad-hoc data sets such as Survey, Rig, etc. These are sent in pre-defined .xls formats to the pre-defined address, parsed, and then loaded into the system’s database.
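A 4-hourly task under Spring Scheduling can be declared with the task XML namespace; the bean and method names below are assumptions for illustration:

```xml
<!-- Hypothetical Spring Scheduling wiring for the 4-hourly email reader;
     bean and method names are placeholders. Spring cron expressions have
     six fields (seconds first), so this fires at minute 0 of every 4th hour. -->
<task:scheduled-tasks>
    <task:scheduled ref="emailReaderTask" method="pollInbox"
                    cron="0 0 */4 * * *"/>
</task:scheduled-tasks>
```

The referenced bean would open the IMAPS connection, pull the .xls attachments, and hand them to the parsers described above.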