Ph.D. Thesis Proposal: Deepal Jayasinghe

Event Details
  • Date/Time:
    • Thursday December 13, 2012 - Friday December 14, 2012
      9:00 am - 10:59 am
  • Location: KACB 3108
Contact

Deepal Jayasinghe

Ph.D. Thesis Proposal Announcement
Title: An Automated Approach to Create, Manage and Analyze Large-scale Experiments for Elastic n-Tier Applications in Computing Clouds

Deepal Jayasinghe
School of Computer Science
College of Computing
Georgia Institute of Technology

Date: Thursday December 13, 2012
Time: 10:00 AM - 12:00 PM EST
Location: Klaus 3108

Committee:

  • Professor Calton Pu, School of Computer Science (Advisor)
  • Professor Ling Liu, School of Computer Science
  • Professor Shamkant B. Navathe, School of Computer Science
  • Professor Ed Omiecinski, School of Computer Science
  • Professor Leo Mark, School of Computer Science


Summary:
Cloud computing is one of the most significant distributed computing models to revolutionize the computing landscape. Clouds are often a cheaper alternative to traditional data centers because they provide on-demand, pay-as-you-go access to elastically scalable resources, and many applications are therefore being migrated from data centers to private and public clouds. The transition to the cloud, however, is not always straightforward and smooth: an application that performed one way in an on-premise data center may not perform identically in a computing cloud, because many variables, including virtualization, affect cloud performance. Companies therefore need a high level of confidence that their applications will deliver the expected performance and scalability, both now and in the future.

Accurate performance prediction requires that the performance-relevant properties of cloud environments be reflected in the prediction models. Yet properties such as resource sharing (multi-tenancy), virtualization overhead (which is highly variable and load dependent), noise from neighboring tenants, and fluctuating resource utilization are nearly impossible to capture in such models. We argue that a promising way to address these challenges is to rely on exhaustive measurements from large-scale experiments.

Large-scale experiments, and running them in cloud environments in particular, introduce a set of challenging research issues. Although manual and ad hoc (i.e., partially scripted) approaches are still widely used for experimental measurements in traditional data centers, the most viable way to address the non-trivial complexities of large-scale experimentation is automation. Automation removes the error-prone and cumbersome involvement of human testers, reduces the burden of configuring and running large-scale experiments for distributed applications, and accelerates reliable application testing.

We enable this automation through code generation. From an architectural viewpoint, our code generator adopts the compiler approach of multiple serial transformation stages; the hallmarks of this approach are that each stage operates on an XML document serving as the intermediate representation and that XSLT performs the code generation. Our automated approach to large-scale experiments has enabled experiments on modern clouds at a scale well beyond manual experimentation and has identified non-trivial performance phenomena that would otherwise not have been observable.
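
As a concrete illustration of that pipeline, the sketch below shows how one stage might transform a small XML experiment specification (the intermediate representation) into a deployment-and-run script using an XSLT stylesheet, applied here through Python's lxml bindings. The specification format, the element and attribute names, and the provision_tier / run_workload commands it emits are hypothetical placeholders for illustration, not the actual schema or tooling of this work.

# Minimal sketch (illustrative only): one code-generation stage that turns an
# XML experiment specification into a shell script via XSLT.
# Requires the lxml package.
from lxml import etree

# Hypothetical intermediate representation of one experiment (placeholder schema).
spec = etree.XML(b"""
<experiment name="sample-n-tier-run">
  <workload users="1000" rampUpSec="300" steadySec="600"/>
  <tier role="web" nodes="2"/>
  <tier role="app" nodes="4"/>
  <tier role="db" nodes="2"/>
</experiment>
""")

# XSLT stylesheet that emits plain text: a script which provisions each tier
# and then starts the workload. Command names are placeholders.
stylesheet = etree.XML(b"""
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/experiment">
    <xsl:text>#!/bin/sh&#10;</xsl:text>
    <xsl:for-each select="tier">
      <xsl:text>provision_tier </xsl:text>
      <xsl:value-of select="@role"/>
      <xsl:text> </xsl:text>
      <xsl:value-of select="@nodes"/>
      <xsl:text>&#10;</xsl:text>
    </xsl:for-each>
    <xsl:text>run_workload --users </xsl:text>
    <xsl:value-of select="workload/@users"/>
    <xsl:text> --ramp </xsl:text>
    <xsl:value-of select="workload/@rampUpSec"/>
    <xsl:text> --steady </xsl:text>
    <xsl:value-of select="workload/@steadySec"/>
    <xsl:text>&#10;</xsl:text>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(stylesheet)
print(str(transform(spec)))  # prints the generated deployment/run script

In a multi-stage pipeline of this kind, earlier stylesheets would rewrite the intermediate XML (for example, expanding an experiment template into many concrete configurations) before a final stage like the one above emits executable scripts.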
