See the winners of the 2011 CI Days Poster Contest.
EGTAOnline: A Web App for Constructing and Analyzing Empirical Games
Numerical Investigations of Convection
daisync, A time-machine disk-based backup tool
MBNI Cloud+: High Performance Computing Optimized for Omics Data Analysis in Biomedical Research
How to Git Stuff Done at UM
Cosmic Sky Machine for the Dark Energy Survey
Scalar Field Model for Simulation of Human Evacuation
How Research Funding Affects Data Sharing
Improving Binding-Site Prediction with Residue Propensities
Treecode-Accelerated Boundary Integral Linear Poisson-Boltzmann Solver for Implicit Solvation of Biomolecules
Dynamic Visualization and Simulation of Vertical and Horizontal Integrated Glare Control Blade System
Quantitative Effects of Floor System on Progressive Collapse Resistance of Steel Frames
Visualization of Sound and Virtual Representation of the Roman Coliseum Using Beamforming
Using High Performance Computing to Study the Role of Symmetry in Electron Transfer for Photovoltaic Materials via Density Functional Theory
Precision Effects on Numerical Solutions to the Sine-Gordon Equation
Visualization of the Human Motion Envelope
The Probabilistic Modeling of Microstructure Evolution Using Finite Element Representation of Probability Density Functions
Cyberinfrastructure for Manufacturing Systems: Needs and Opportunities
Cyber Environment of Wireless Structural Health Monitoring System on Large-Scale Bridges
EGTAOnline: A Web App for Constructing and Analyzing Empirical Games
Ben-Alexander Cassell, Michael P. Wellman
Constructing empirical games from simulation can involve tremendous amounts of computation. As the size of a game grows exponentially with the number of players and strategies, even games of only moderate complexity have significant data gathering and management concerns. Existing cyberinfrastructure can address these issues, but we have found it very costly to train researchers to use currently available tools. Such training can involve learning a new scripting language and a database query language. In the face of this cost, researchers often choose to examine small games that can be constructed on their personal computers. To alleviate this problem, we have developed a web app and complementary analysis service. Our web app allows researchers to write their simulators in the programming language of their choice, requiring only that they subscribe to a simple file-based API for simulation configuration and data logging. Additionally, our web app lets researchers take advantage of the parallel computing capabilities of university-owned clusters without having to learn an arcane scripting language. As a result, our lab is able to examine larger games, with a higher throughput, than ever before.
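The file-based API itself is not specified in this abstract; the sketch below illustrates the general pattern under stated assumptions. The file names, JSON keys, and function names are hypothetical, not EGTAOnline's actual interface.

    import json, sys

    # Hypothetical sketch of a file-based simulation API: the web app writes
    # a configuration file, the simulator reads it, runs, and appends results
    # to a log file. Names and keys are illustrative, not EGTAOnline's spec.
    def run_simulation(config_path, log_path):
        with open(config_path) as f:
            config = json.load(f)          # e.g. {"strategies": [...], "samples": 10}
        for i in range(config["samples"]):
            payoffs = simulate_one_game(config["strategies"])  # user-supplied simulator
            with open(log_path, "a") as log:
                log.write(json.dumps({"sample": i, "payoffs": payoffs}) + "\n")

    def simulate_one_game(strategies):
        # Placeholder: a real simulator would play one game here.
        return {s: 0.0 for s in strategies}

    if __name__ == "__main__":
        run_simulation(sys.argv[1], sys.argv[2])

Because the contract is just "read this file, write that file," the simulator can be written in any language the researcher prefers.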
Numerical Investigations of Convection
B.P. Cloutier, H. Johnson, B.K. Muite, P. Rigge, J.P. Whitehead
We report on high-resolution numerical studies of infinite Prandtl number convection using a simplified model relevant to the motion of the Earth’s mantle. The simulations use pseudo-spectral Fourier (x-direction) and Chebyshev (z-direction) methods. The model uses the incompressible Navier-Stokes equations with the Boussinesq approximation and free-slip velocity boundary conditions, and is driven solely by internal heating. We examine the transition from conduction to steady convection, to unsteady laminar convection, and lastly to chaotic convection. Some of this work was performed on Trestles (SDSC), provided through TeraGrid resource support, award TG-CTS110010.
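As a minimal one-dimensional illustration of the pseudo-spectral idea, not the authors' code, the x-derivative of a periodic field can be computed by transforming to Fourier space, multiplying by ik, and transforming back:

    import numpy as np

    # Pseudo-spectral derivative on the periodic domain [0, 2*pi):
    # transform, multiply by i*k, transform back.
    N = 64
    x = np.linspace(0, 2 * np.pi, N, endpoint=False)
    u = np.sin(3 * x)

    k = np.fft.fftfreq(N, d=1.0 / N)           # integer wavenumbers 0..N/2-1, -N/2..-1
    du = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

    assert np.allclose(du, 3 * np.cos(3 * x))  # exact for band-limited data

The Chebyshev direction works analogously with cosine transforms or differentiation matrices, which is what allows non-periodic boundary conditions in z.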
daisync, A time-machine disk-based backup tool
Manhong Dai, Tyler Brubaker, Heng Wang, Joshua Buckner, Fan Meng
MBNI’s tape backup solution consistently ran into the following problems:
1. Each run took days to finish. Because of this, backup was done monthly or bi-weekly at best.
2. The system was very slow to recover files, both for individual file requests and for restoring entire servers.
3. Recovery had to be done by an administrator.
Given these problems and restrictions, we sought a solution based on hard drives rather than tape. We researched several existing software options that support hard disk backup. The problem with these solutions is that the backup data is not kept in a regular file format.
This became an especially pertinent issue because MBNI was going to expand its file servers. Hence we built white-box storage servers and developed a disk-based backup tool called ‘daisync’. daisync’s features are:
1. Time-machine style backup – each snapshot is a full backup.
2. Space savings: an unmodified file shares the same storage across different snapshots (see the sketch after this list).
3. Further space savings by linking unmodified files even after they have moved to a different folder.
4. Ability to back up both Windows and *nix file servers.
5. Recovery is as simple as getting a file from a Windows file server.
6. End users can retrieve backups on their own.
7. Granular permission control to forbid unauthorized access by an end-user.
8. Graphic reports for storage usage, backup time, etc.
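Features 1-3 describe snapshot trees whose unchanged files are shared via hard links; a sketch of that general technique follows. This is an illustration of the idea, not daisync's actual implementation.

    import os, shutil, filecmp

    # Time-machine-style snapshot sketch (not daisync's code): each snapshot
    # is a full directory tree, but files unchanged since the previous
    # snapshot are hard-linked, so they occupy the same disk blocks.
    def snapshot(source, prev_snap, new_snap):
        for root, dirs, files in os.walk(source):
            rel = os.path.relpath(root, source)
            os.makedirs(os.path.join(new_snap, rel), exist_ok=True)
            for name in files:
                src = os.path.join(root, name)
                old = os.path.join(prev_snap, rel, name)
                new = os.path.join(new_snap, rel, name)
                if os.path.exists(old) and filecmp.cmp(src, old, shallow=False):
                    os.link(old, new)       # unchanged: reuse the stored copy
                else:
                    shutil.copy2(src, new)  # changed or new: store a fresh copy

Because every snapshot is an ordinary directory tree, recovery is just copying a file back, which is what makes end-user self-service retrieval possible.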
daisync has now been in use by MBNI and the U of M Depression Center for three years. At MBNI, it backs up a 247T MBNI Cloud+, two 96T main file servers and a 20T image server. In addition, backup frequency has been increased to daily. daisync is released under GPL at http://daisync.sf.net.
MBNI Cloud+: High Performance Computing Optimized for Omics Data Analysis in Biomedical Research
Manhong Dai, Fan Meng
The analysis of deep sequencing and other high-throughput genomic, transcriptomic, proteomic and metabolomic data from biomedical research demands powerful data processing capabilities. However, most existing high performance computing solutions, including Amazon EC2, are not optimized for the needs of omics data analysis, which usually involves large volumes of data even though the related computation is often not CPU intensive. In addition, structural and functional annotations for various biological entities such as genes, genomic elements and metabolites are often used during the computation process, the integration of multiple data sets and the interpretation of analysis results. Consequently, it is necessary to integrate a database containing updated annotation information derived from multiple data sources with the computing solution. Moreover, the richness of information hidden in deep sequencing and other omics data usually requires researchers to perform analyses from different perspectives; thus, large storage space that can accommodate data and analysis results for a longer period of time is highly desirable. Another unique issue is that sequencing data, and most likely other omics data, can be used to identify individuals, so the data analysis environment also needs to comply with regulatory rules related to privacy protection and data security.
The MBNI Cloud+ is our effort at creating a highly scalable solution addressing omics data analysis needs. It currently has 140 computing cores, each with access to at least 8G of memory, as well as 274T of file storage and a 40T Oracle database in a secured environment. You are welcome to join the MBNI Cloud+ through our very cost-effective co-op model (http://cluster.mbni.med.umich.edu/).
How to Git Stuff Done at UM
Derek Dalle, Sean Torrez
When a file is shared between multiple users, there is a natural need to keep all files current and accessible to everyone. However, not all users should have equal rights to all files, and the process becomes more complicated when the files in question are source code. Ideally, changes would be updated in real time and distributed throughout the system, but this is difficult or impossible when static results are required (such as compiled code). Developers should be able to make changes and test them without affecting other users and still be able to distribute the changes once they have been tested. Collaborators should be able to come and go from the project as necessary. All of this should happen without much work or thought from the developer or other users.
This describes an ideal coding environment. In fact, most of the resources needed for this environment already exist at the University of Michigan. This poster describes an ongoing two-person collaboration using Git for version control, AFS for file sharing, and CTools for secure, private distribution of static results. Public projects are distributed using a web interface on top of Git. Use of these systems results in a remarkably flexible process that allows pair (or group) programming on separate computers, even if they are geographically separated.
Cosmic Sky Machine for the Dark Energy Survey
Brandon Erickson
We describe efforts by the Simulation Working Group (SimWG) of the Dark Energy Survey (DES) to develop an efficient workflow environment for the production of wide-area synthetic galaxy catalogs that include self-consistent gravitational shear. The COsmic Sky MAchine (COSMA) environment transforms multiple 10^{10}-particle N-body simulations of nested volumes into multi-band, catalog-level descriptions of galaxies covering the full sky to high redshift. Such catalogs serve as truth tables for science pipeline validation, and DES science teams require multiple realizations covering different cosmologies to support a Blind Cosmology Challenge process now getting underway. We outline our processing steps, including required empirical input, and present initial validation tests of a ΛCDM catalog at z ~ 1. We sketch efforts underway to integrate our codes with NSF XSEDE workflow and gateway tools, with the aim of reducing production time for a single cosmology, including N-body simulation generation, from months to weeks. By creating an efficient, portable framework for generating science-grade, synthetic galaxy catalogs, we hope to lay the groundwork for support of future optical surveys, such as LSST, whose large data volumes demand sophisticated simulations to extract the best possible science.
Scalar Field Model for Simulation of Human Evacuation
Jieshi Fang
Research on human behavior in emergencies depends heavily on simulation. However, existing evacuation models often lack enough realism to address various environments or conditions. A particular deficiency is the inability to accurately model the complex social networks that exist among evacuees. In order to create a more meaningful tool for studying human egress behavior, an agent-based model is built on a scalar-field method. Each agent in this model has stochastic characteristics and is independent. By analogy to a particle in an electric field, the behavior of each evacuee is governed by minimizing the sum of “virtual potential energies”, which are the path integrals of attractive and repulsive forces that simulate interactions between the agent and all surrounding agents and objects. By representing “thought” processes as a minimization problem, the proposed model is no longer rule based as most existing models are and is, therefore, able to handle conflicting emotions and outside influences in a natural and unified manner. The disadvantage is that the model is much more computationally expensive.
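As a hedged sketch of this kind of potential-field agent update, not the author's actual model or parameters, each agent can descend the net force of an attractive exit term and pairwise repulsive terms:

    import numpy as np

    # Illustrative potential-field step (an assumed form of the general
    # approach, not the author's model): each agent feels an attractive
    # force toward the exit and repulsive forces from all other agents.
    def step(agents, exit_pos, dt=0.1, k_attr=1.0, k_rep=0.5):
        new_positions = []
        for i, p in enumerate(agents):
            force = k_attr * (exit_pos - p) / np.linalg.norm(exit_pos - p)
            for j, q in enumerate(agents):
                if i == j:
                    continue
                d = p - q
                r = np.linalg.norm(d)
                force += k_rep * d / r**3      # repulsion decays as 1/r^2
            new_positions.append(p + dt * force)
        return new_positions

    agents = [np.array([0.0, 0.0]), np.array([0.5, 0.2])]
    agents = step(agents, exit_pos=np.array([10.0, 0.0]))

Obstacles and social ties would enter the same way, as additional force terms, which is why the approach handles conflicting influences in one unified minimization.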
How Research Funding Affects Data Sharing
Karina Kervin, Margaret Hedstrom
In order to encourage interdisciplinary research, the National Institutes of Health (NIH) and the National Science Foundation (NSF) mandate that researchers make their data public, providing an incentive for data sharing. While this has encouraged data sharing in some fields, fields with little NSF or NIH funding do not have the same incentives. In this work, we find that these other funding sources either fail to encourage data sharing or, in some cases, actively discourage it.
Improving Binding-Site Prediction with Residue Propensities
Nickolay A. Khazanov, Heather A. Carlson
Current structure-based drug design (SBDD) methods require a good understanding of general trends of protein-ligand interactions and the composition of binding sites. Residue propensities for protein-ligand binding sites were calculated from a large, curated set of high-quality, protein-ligand complexes available in the Binding MOAD database. Differences in composition were determined among binding sites of biologically relevant versus opportunistic ligands, and the robustness of these general trends was analyzed with respect to the data set size. The propensities of binding-site residues were used to improve the success rate of a geometry-based, binding-site prediction algorithm. Propensity-based scores can rank true binding sites as the top predictions better than the geometric criteria alone. The propensity-based scores are especially useful for locating the functional site in larger proteins, where multiple large pockets are predicted by geometric criteria.
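A minimal sketch of propensity-based scoring, with made-up numbers rather than the actual Binding MOAD statistics: the propensity of a residue is its frequency in binding sites divided by its background frequency, and a candidate pocket is scored by summing log-propensities.

    import math

    # Illustrative frequencies only, not Binding MOAD's published values.
    site_freq    = {"TRP": 0.040, "HIS": 0.035, "ALA": 0.050}
    surface_freq = {"TRP": 0.012, "HIS": 0.022, "ALA": 0.080}

    # Propensity > 1 means the residue is enriched in binding sites.
    propensity = {r: site_freq[r] / surface_freq[r] for r in site_freq}

    def pocket_score(pocket_residues):
        # Sum of log-propensities: positive scores favor ligand binding.
        return sum(math.log(propensity[r]) for r in pocket_residues)

    print(pocket_score(["TRP", "HIS"]))   # enriched residues -> high score
    print(pocket_score(["ALA", "ALA"]))   # depleted residues -> low score

Ranking geometrically detected pockets by such a score is what lets composition break ties among multiple large pockets in big proteins.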
This work demonstrates the utility of residue-propensity heuristics for improving SBDD methods that infer function from a protein’s surface pockets. Potential applications for residue composition heuristics include methods for knowledge-based binding-site prediction and binding-site classification/comparison. Binding-site compositions are being made available through the Binding MOAD database. Tools will be added to calculate composition statistics on desired subsets of protein-ligand complexes and to assess the statistical significance of the trends.
Treecode-Accelerated Boundary Integral Linear Poisson-Boltzmann Solver for Implicit Solvation of Biomolecules
Robert Krasny, Weihua Geng
Solvation of biomolecules is a challenging problem in computational biophysics. Models that track explicit solvent molecules are extremely costly, and implicit solvent models based on the Poisson-Boltzmann (PB) equation provide an efficient alternative for computing solvent-solute interactions. Even so, PB solvers still encounter numerical difficulties stemming from the discontinuous dielectric constant across the molecular surface, the boundary condition at spatial infinity, and the presence of charge singularities representing the biomolecule.
To address these issues, we present a linear PB solver employing a well-conditioned boundary integral formulation [1] and GMRES iteration accelerated by a treecode algorithm [2]. The accuracy and efficiency of the method are assessed for the Kirkwood sphere and a solvated protein (PDB:1A63). The present scheme offers relatively simple implementation, efficient memory usage, and straightforward parallelization. We compare numerical results for both the Poisson-Boltzmann and Poisson equations, using the proposed treecode-accelerated boundary integral solver as well as the mesh-based Adaptive Poisson-Boltzmann Solver (APBS) [3].
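For context, a standard statement of the linearized PB model such solvers target (textbook form, not quoted from [1]) is

    \nabla \cdot \bigl( \epsilon(\mathbf{x}) \nabla \phi(\mathbf{x}) \bigr) - \bar{\kappa}^{2}(\mathbf{x})\, \phi(\mathbf{x}) = -4\pi \sum_{i=1}^{N_c} q_i\, \delta(\mathbf{x} - \mathbf{x}_i), \qquad \phi(\mathbf{x}) \to 0 \ \text{as} \ |\mathbf{x}| \to \infty,

where the dielectric coefficient \epsilon and the modified Debye-Huckel parameter \bar{\kappa} jump across the molecular surface. The boundary integral formulation replaces this volume problem with integral equations on that surface, and the resulting dense matrix-vector products are what the treecode accelerates.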
[1] A. Juffer, E. Botta, B. van Keulen, A. van der Ploeg, H. Berendsen, The electric potential of a macromolecule in a solvent: a fundamental approach, J. Comput. Phys. 97 (1991) 144-171.
[2] P.J. Li, R. Krasny, H. Johnston, A Cartesian treecode for screened Coulomb interactions, J. Comput. Phys. 228 (2009) 3858-3868.
[3] N.A. Baker, D. Sept, S. Joseph, M.J. Holst, J.A. McCammon, Electrostatics of nanosystems: application to microtubules and the ribosome, PNAS 98 (2001) 10037-10041.
Dynamic Visualization and Simulation of Vertical and Horizontal Integrated Glare Control Blade System
Robin Li, Mathew Schwartz, Mojtaba Navvab
The benefits of having natural daylight within buildings are widely acknowledged by architects and lighting designers. Most open-plan office users respond better when they have access to an outside view, and consider their working environment better when they can control the natural daylight conditions. Lighting designers do their best to avoid glare from direct viewing of the sun’s rays within the workplace, and try to create buildings that evoke public interest and occupant satisfaction. There is a need for an automatic shading control system for buildings with large areas of glazing that not only eliminates glare at all times, but also preserves access to the outside view.
The proposed vertical and horizontal integrated blade system can be a simple addition to the interior cavity of an existing or newly designed window system. Driven by a newly developed algorithm programmed within the building automation system, the blades block the sun in real time while the outside view remains visible from nearly everywhere within the office space. The integrated blade system performs optimally by reducing glare and blocking a direct view of the sun’s rays while maintaining acceptable levels of natural daylight and view within the space. The associated benefits, if utilized and integrated with other building systems, are the reduction of cooling loads in summer and heating requirements in winter, the potential for creating aesthetic impact, and the integration of photovoltaic cells with opaque and/or transparent materials.
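The control algorithm itself is not given in the abstract; the sketch below shows one simple geometric rule for tilting horizontal slats based on the solar profile angle. It is an assumption for illustration, not the authors' algorithm.

    import math

    # Simplified slat-control sketch (an assumed rule, not the authors'
    # algorithm): compute the solar profile angle in the vertical plane
    # normal to the window, then tilt the slats perpendicular to that
    # direction so direct rays cannot pass between them. Assumes slat
    # width >= slat spacing and the sun in front of the window.
    def profile_angle(sun_altitude_deg, sun_azimuth_deg, window_azimuth_deg):
        alt = math.radians(sun_altitude_deg)
        rel_az = math.radians(sun_azimuth_deg - window_azimuth_deg)
        return math.degrees(math.atan2(math.tan(alt), math.cos(rel_az)))

    def slat_tilt(sun_altitude_deg, sun_azimuth_deg, window_azimuth_deg):
        p = profile_angle(sun_altitude_deg, sun_azimuth_deg, window_azimuth_deg)
        return max(0.0, 90.0 - p)   # high sun -> near-horizontal slats

    # Example: mid-afternoon sun on a south-facing window.
    print(slat_tilt(sun_altitude_deg=35.0, sun_azimuth_deg=220.0,
                    window_azimuth_deg=180.0))

A real controller would additionally trade off daylight level and view, which is the optimization the abstract alludes to.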
Quantitative Effects of Floor System on Progressive Collapse Resistance of Steel Frames
Honghao Li, Sherif El-Tawil
This study investigates the role that floor systems play in the collapse resistance of steel structures. A 10-story, seismically designed steel building is used as a case study. The model employed in the simulation studies is a 3-D computational finite element model that accounts for the nonlinear, inelastic behavior of the most important components of the building. After validation, the simulation model is exercised to investigate system robustness when columns are forcibly removed, with specific emphasis on the role of the floor. The simulation studies shed light on the effect of composite action between the floor and underlying steel beams and catenary action in the floor components. The results show that the floor system contributes significantly to collapse response. It is also shown that the results of planar analyses that ignore the effect of the floor cannot always be viewed as conservative.
Visualization of Sound and Virtual Representation of the Roman Coliseum Using Beamforming
Mojtaba Navvab, Fabio Bisegna, Gunnar Heilmann, Magdalena Böck
The acoustic properties of ancient Greek and Roman theaters have been studied by many investigators, with the goal of accurate reconstruction from possible alternatives of material and design evolution. Parametric studies and computer simulation methodologies for ancient theaters provide new indexes for examining the contribution of each design component. Measured and simulated results show that scattering and diffraction from seats and architectural elements, which are important in outdoor theaters, affect sound quality and listening conditions. Specific changes in material characteristics have increased reverberation and enhanced sound levels. Computer simulations using a range of boundary absorption and scattering coefficients play a very important role in supporting the choice of the best, or at least the most acceptable, reconstruction or sustainable design approach among the alternatives considered by the superintendents and managers of these historical sites.
This research represents an international collaboration among the USA, Italy and Germany, applying a newly developed beamforming technique as a close numerical examination of the relevant acoustical aspects of ancient theaters, based on a comparison of ancient and modern structures. The CAVE, or Virtual Reality laboratory, provides a well-established tool for this task. Simulations have been carried out to evaluate the acoustics of the orchestra, the cavea and the stage, using the theater of Ancient Ostia and the Rome Coliseum as references for ancient theaters. Virtual reconstructions of these theaters, combined with auralization techniques, provide the opportunity not only to investigate the performance of these theaters in different eras, but also to give users a different experience within the virtual world of ancient acoustics, given the growing availability of computers capable of both visualization and virtual acoustics.
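Beamforming in this context localizes sound sources with a microphone array; a minimal delay-and-sum sketch of the generic technique, not the actual measurement system used here, is:

    import numpy as np

    # Minimal delay-and-sum beamformer: align the microphone signals by the
    # travel delay from a candidate source point, sum them, and measure the
    # power. `signals` is a 2-D array of shape (num_mics, num_samples).
    def beam_power(signals, mic_positions, point, fs, c=343.0):
        delays = np.linalg.norm(mic_positions - point, axis=1) / c   # seconds
        shifts = np.round((delays - delays.min()) * fs).astype(int)  # samples
        n = signals.shape[1] - shifts.max()
        summed = sum(sig[s:s + n] for sig, s in zip(signals, shifts))
        return np.mean(summed**2)   # peaks when `point` is the true source

    # Scanning a grid of candidate points and mapping beam_power over it
    # yields the acoustic "image" used to visualize sound sources.

This is what produces the sound-source maps that can then be overlaid on the virtual model of the space.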
Using High Performance Computing to Study the Role of Symmetry in Electron Transfer for Photovoltaic Materials via Density Functional Theory
Heidi Phillips, Eitan Geva, Barry Dunietz
Light harvesting relies on photo-induced charge transfer and can be studied through electronic structure calculations. Recently, however, conventional Density Functional Theory (DFT) has been shown to fail in calculating excited-state properties of charge-transfer systems. Novel range-separated density functionals have been developed that can accurately treat excited states with charge-transfer character. In this study, we apply these functionals to elucidate the effect of symmetry on the charge-transfer states of two molecular systems: a model ethene dimer, and a dye-functionalized silsesquioxane molecule with photovoltaic applications. We compare the TDDFT excitation energies calculated using conventional local density approximation, generalized-gradient approximation, and hybrid functionals to those of a novel range-separated functional, BNL. Calculations were performed using the Q-Chem Program Package version 4.0. We utilize three types of calculations: geometry optimizations, normal-mode frequencies, and single-point energy TDDFT. Calculations were run on the Flux cluster.
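For reference, range-separated functionals such as BNL split the Coulomb operator using the error function (standard form, with range-separation parameter \omega):

    \frac{1}{r} = \frac{\operatorname{erfc}(\omega r)}{r} + \frac{\operatorname{erf}(\omega r)}{r},

treating the short-range term with semilocal DFT exchange and the long-range term with exact exchange. This restores the correct -1/r asymptotic behavior that charge-transfer excitation energies require, which is why conventional functionals fail for these states.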
Precision Effects on Numerical Solutions to the Sine-Gordon Equation
Paul Rigge, Benson K. Muite
We examine numerical precision effects for the sine-Gordon equation. We implement high-order implicit Runge-Kutta solvers using fixed-point iteration and compare diagonally and fully implicit schemes. We find that in quadruple precision, fourteenth-order time-stepping schemes are very efficient.
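As an illustration of fixed-point iteration for an implicit scheme, not the authors' fourteenth-order solver, one implicit-midpoint step (the one-stage Gauss-Legendre method) for the sine-Gordon equation u_tt = u_xx - sin(u), written as the first-order system u_t = v, v_t = u_xx - sin(u) on a periodic domain:

    import numpy as np

    N, h = 128, 1e-3
    x = np.linspace(0, 2 * np.pi, N, endpoint=False)
    k = np.fft.fftfreq(N, d=1.0 / N)            # integer wavenumbers

    def f(u, v):
        # Right-hand side; u_xx computed pseudo-spectrally.
        u_xx = np.real(np.fft.ifft(-k**2 * np.fft.fft(u)))
        return v, u_xx - np.sin(u)

    def midpoint_step(u, v):
        # Solve y_new = y + h*f((y + y_new)/2) by fixed-point iteration
        # (a fixed iteration count for simplicity; a tolerance test would
        # be used in practice).
        u_new, v_new = u, v
        for _ in range(50):
            fu, fv = f(0.5 * (u + u_new), 0.5 * (v + v_new))
            u_new, v_new = u + h * fu, v + h * fv
        return u_new, v_new

    u, v = 4 * np.arctan(np.exp(x - np.pi)), np.zeros(N)  # kink-like data
    u, v = midpoint_step(u, v)

Higher-order Gauss methods follow the same pattern with several coupled stages, and it is the convergence of this inner iteration that becomes sensitive to working precision.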
Visualization of the Human Motion Envelope
Mathew Schwartz, Janani Viswanathan
The use of human ergonomics diagramming has been pivotal in almost all design. Fields from engineering to art have generally used the same two-dimensional ergonomic diagrams. The accessibility of computational 3D modeling tools has brought a new level of sophistication to the design field, while the ergonomic diagrams have remained largely unchanged. The few examples of alternative visualizations of ergonomics are largely proprietary and made for specific purposes such as human interactions in automobiles. With standards in design and architecture such as the Americans with Disabilities Act, designers are required to know more about the way in which people move and the limitations of their movement. This research looks at alternative methods for creating ergonomic diagrams, as well as at using motion capture and inverse kinematics to give designers a three-dimensional visualization of the human motion envelope in relation to their work. The two approaches are developed in parallel, allowing a comparison of the results. As industrial chair design is largely based on two-dimensional ergonomic drawings, this work uses the standard chair as a base case for demonstrating the power of a multidimensional visualization system. The use of a chair as a test case does not, however, limit the application of the work. Spatial design in settings such as NASA spacecraft, jails, submarines, and pod hotels benefits from a more thorough understanding of the human envelope. While the current work deals specifically with the human envelope, it has been set up so that additional factors can be incorporated in the visualization. Physiological and psychological factors can be integrated into the system, allowing auditory and optical information to be displayed.
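Inverse kinematics of the kind mentioned above can be illustrated with a planar two-link arm; the link lengths and the sweep are assumptions for illustration, not the authors' pipeline:

    import numpy as np

    # Two-link planar inverse kinematics: given a target (x, y), solve for
    # the joint angles; sweeping targets traces the reachable envelope.
    L1, L2 = 0.3, 0.25   # upper-arm and forearm lengths in meters (assumed)

    def ik_two_link(x, y):
        d2 = x * x + y * y
        c2 = (d2 - L1**2 - L2**2) / (2 * L1 * L2)
        if abs(c2) > 1:
            return None                        # target outside the envelope
        theta2 = np.arccos(c2)                 # elbow angle
        theta1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(theta2),
                                               L1 + L2 * np.cos(theta2))
        return theta1, theta2

    # Reachable points form an annulus with radii |L1 - L2| and L1 + L2.
    print(ik_two_link(0.4, 0.2))   # reachable: returns joint angles
    print(ik_two_link(0.9, 0.0))   # None: beyond full extension

Extending the same test to a full skeletal chain, with joint limits from motion-capture data, yields the three-dimensional motion envelope described above.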
The Probabilistic Modeling of Microstructure Evolution Using Finite Element Representation of Probability Density Functions
Shang Sun, Veera Sundararaghavan
Polycrystalline plasticity theories and finite element methods for analyzing microstructures are fairly well developed; they can simulate the mechanisms and predict the distribution of material properties in a component, but they are too costly for industrial microstructure modeling because of their high computational demand. Considering also the stochastic nature of polycrystalline microstructures, more and more researchers are turning to probabilistic descriptors to represent and compute the evolution of microstructure. A probabilistic finite element descriptor is presented here for simulating the evolution of polycrystalline microstructures during deformation. The microstructure is described using a four-dimensional probability density function (4DPDF), defined as the probability density of occurrence of a crystal orientation g′ at a distance r from another orientation g. The 4DPDF is represented using four-dimensional finite element meshes in the g′, r and g spaces. As the microstructure evolves, the reoriented neighborhood and strain field are captured by updating probability fields in these finite element meshes. For this purpose, a novel total Lagrangian approach has been developed that allows evolution of probability densities while satisfying normalization constraints, probability interdependencies and symmetries. The improvement in prediction of texture and strains achieved by the 4DPDF approach is quantified through deformation analysis of a planar polycrystalline microstructure.
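In this notation, the normalization constraint the total Lagrangian approach must preserve can be written (a reconstruction from the description above, not the authors' exact formulation) as

    \int_{SO(3)} A(g', r, g) \, \mathrm{d}g' = 1 \quad \text{for all } r \text{ and } g,

so that, for each reference orientation g and separation r, the density over the neighboring orientation g′ integrates to one as the probability fields evolve on the finite element meshes.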
Cyberinfrastructure for Manufacturing Systems: Needs and Opportunities
J. Viswanathan, J. Hu, Z. Mao, D. Tilbury
The advent of the Internet facilitated the development of e-businesses using web-based services that leverage various back-end database, inventory and payment processing systems. Such online retail businesses can be set up in a relatively short period of time. In contrast, setting up a manufacturing system takes much longer, since the system design involves various stages, from gaining an understanding of the product and its market to machine selection, layout, control logic, part flow analysis and so on. Cyberinfrastructure has the potential to revolutionize the design and operation of manufacturing systems. The goal of this project is to understand the requirements of manufacturing cyberinfrastructure in terms of data standards, protocols and architecture, and to determine how cyberinfrastructure technology can be used to create a collaborative environment distributed across a number of physical locations through web-based virtual communities, thereby simplifying and streamlining the manufacturing process. Existing work on cyberinfrastructure for e-commerce and other domains will be leveraged to identify technology that can be applied to manufacturing systems directly, with minor extensions, or through a paradigm shift. This poster summarizes our findings in the project thus far and invites comments on future investigations or collaborations that could lead toward the vision of a manufacturing cyberinfrastructure.
Cyber Environment of Wireless Structural Health Monitoring System on Large-Scale Bridges
Yilan Zhang, Masahiro Kurata, Jerome P. Lynch
This poster reports on the progress of constructing cyberinfrastructure tools that hierarchically control multiple sub-networks of wireless sensors deployed at a long-span bridge. The scalability and long-term robustness of the proposed cyberinfrastructure have been evaluated using wireless sensor sub-networks permanently deployed at the New Carquinez Suspension Bridge in Vallejo, CA. Under development are several data interrogation tools, including an automated sensor logging tool for the detection of faulty sensors, an automated system identification tool for long-term monitoring of variation in bridge modal properties, and a model-updating tool for verifying and updating a high-fidelity finite element model of the bridge.