- Scientific Computing and Data Management
- Distributed and Parallel Computing Systems
- Parallel Computing and Optimization Techniques
- Research Data Management Practices
- Meteorological Phenomena and Simulations
- Radiomics and Machine Learning in Medical Imaging
- Advanced X-ray and CT Imaging
- Digital Radiography and Breast Imaging
- NMR Spectroscopy and Applications
- Environmental Monitoring and Data Management
- Advanced MRI Techniques and Applications
- Advanced Data Storage Technologies
- Computational Physics and Python Applications
- Cloud Computing and Resource Management
- Copper Interconnects and Reliability
- Geological Modeling and Analysis
- Advanced Computational Techniques and Applications
- Advanced NMR Techniques and Applications
- Electrodeposition and Electroless Coatings
- Mobile Agent-Based Network Management
- Corrosion Behavior and Inhibition
- Evaluation of Teaching Practices
- Video Analysis and Summarization
- Anodic Oxide Films and Nanostructures
- Liver Disease Diagnosis and Treatment
University of Illinois Urbana-Champaign
2005-2023
National Center for Supercomputing Applications
1997-2018
National Center for Supercomputing Applications
2002-2016
St. Francis Medical Center
1991
Within a decade after John von Neumann and colleagues conducted the first experimental weather forecast on the ENIAC computer in the late 1940s, numerical models of the atmosphere became the foundation of modern-day forecasting and one of the driving application areas of computational science. This article describes research that is enabling a major shift toward dynamically adaptive responses to rapidly changing environmental conditions.
Abstract We review the efforts of the Open Grid Computing Environments collaboration. By adopting a general three-tiered architecture based on common standards for portlets and Web services, we can deliver numerous capabilities to science gateways from our diverse constituent efforts. In this paper, we discuss support for standards-based portlet development using the Velocity development environment. Our Grid interfaces are built on abstraction layers provided by the Java CoG kit, which hide the differences between different Grid toolkits. Sophisticated services...
Abstract Copper electrodeposition in submicron trenches involves phenomena that span many orders of magnitude in time and length scales. In the present work, two codes that simulate electrochemical phenomena on different scales were externally linked. A Monte Carlo code simulated the surface in order to resolve roughness evolution during trench in-fill. A 2-D finite difference code simulated transport through the diffusion boundary layer outside the trench. The continuum code passed fluxes to the Monte Carlo code, which passed back a concentration to the continuum code. A numerical instability arises...
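The back-and-forth exchange described in this abstract can be sketched as a fixed-point iteration between two solvers. The stand-in models and the under-relaxation factor below are illustrative assumptions, not the paper's actual KMC and finite-difference codes:

```python
# Sketch of the code-coupling loop: a continuum transport model and a
# surface model exchange flux and concentration until they agree.
# Both models here are hypothetical one-line stand-ins.

def continuum_flux(c_surface, c_bulk=1.0, k_transport=0.5):
    # Continuum code: flux through the diffusion boundary layer is
    # driven by the concentration difference across it.
    return k_transport * (c_bulk - c_surface)

def surface_concentration(flux, k_reaction=0.8):
    # Surface code: consumes the incoming flux and reports the
    # resulting near-surface concentration back to the continuum code.
    return flux / k_reaction

def couple(relax=0.5, tol=1e-10, max_iter=1000):
    """Fixed-point iteration with under-relaxation, one common way to
    damp the numerical instability of naive alternation."""
    c = 0.0
    for _ in range(max_iter):
        flux = continuum_flux(c)
        c_new = surface_concentration(flux)
        if abs(c_new - c) < tol:
            return c_new
        c = c + relax * (c_new - c)  # under-relaxed update
    return c
```

With these stand-in coefficients the iteration converges to the analytic fixed point c = 0.625 / 1.625.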
This work describes an approach to building Grid applications based on the premise that users who wish to access and run these applications prefer to do so without becoming experts in Grid technology. We describe an application architecture based on wrapping user workflows as Web services and Web service resources. These are made visible to resource providers through a family of portal components that can be used to configure, launch, and monitor complex applications in the scientific language of the end user. The core of this model is instantiated by an application factory service. The layered design makes it...
Sophisticated data-distribution schemes and recent developments in sensors and instruments that can monitor the lower kilometers of the atmosphere at high levels of resolution have rapidly expanded the quantity of information available to mesoscale meteorology. The myLEAD personalized information-management tool helps geoscience users make sense of this vastly expanded information space. MyLEAD extends the general Globus metadata catalog service and leverages a well-known extensible schema. Its service orientation makes it an active player...
During the initial stages of copper electrodeposition onto a thin seed layer, a nonuniform potential distribution arises, resulting in local variations in growth rate and deposit morphology. The early stages of morphology evolution during deposition are of practical importance but have not been well studied. Here, a new multiscale approach is developed for numerical simulation of the effect of macroscopic potential variations along the seed layer on microscopic roughness evolution. The key contributions are a generic method for coupling multiple computer codes and a demonstration...
The design and prototype implementation of the XCAT Grid Science Portal is described in this paper. The portal lets grid application programmers easily script complex distributed computations and package these applications with simple interfaces for others to use. Each application is packaged as a "notebook" which consists of web pages and editable parameterized scripts. The portal is a workstation-based specialized "personal" server, capable of executing the scripts and launching remote applications for the user. The server can receive event streams published by resource...
The superparamagnetic particle dextran magnetite was studied as a liver tumor contrast agent for magnetic resonance imaging (MRI). The effects of the agent on the longitudinal (T1) and transverse (T2) relaxation times in liver, spleen, and an implanted rat tumor were measured at 0.47 T (IBM/Bruker PC-20 relaxometer) over a dose range of 23 to 69 μmol Fe/kg. Dextran magnetite substantially reduced T2 in liver but not in tumor, thereby providing a basis for improved tumor imaging. T1 was affected following injection in the tissues studied, with spleen affected more than liver....
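The dose-dependent T2 shortening reported here is conventionally described by the linear relaxivity model; the sketch below uses that textbook relation with illustrative numbers, not values measured in this study:

```python
# Textbook relaxivity model (illustrative numbers, not this study's data):
# the relaxation rate 1/T2 increases linearly with agent concentration C,
#     1/T2(C) = 1/T2(0) + r2 * C,
# so a larger dose shortens T2 and darkens T2-weighted images.

def t2_with_agent(t2_baseline_s, r2_per_mM_s, conc_mM):
    # Add the concentration-dependent rate to the baseline rate,
    # then invert back to a relaxation time.
    rate = 1.0 / t2_baseline_s + r2_per_mM_s * conc_mM
    return 1.0 / rate

# e.g. baseline T2 = 50 ms, r2 = 100 s^-1 mM^-1, C = 0.1 mM:
# rate = 20 + 10 = 30 s^-1, so T2 drops to about 33 ms.
```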
We present Brown Dog, two highly extensible services that aim to leverage any existing pieces of code, libraries, services, or standalone software (past or present) towards providing users with a simple-to-use and programmable means of automated aid in the curation and indexing of distributed collections of uncurated and/or unstructured data. Data such as these, encompassing large varieties in addition to large amounts of data, pose a significant challenge within modern day "Big Data" efforts. The Data Access Proxy (DAP) and the Data Tilling...
This paper describes the design and prototype implementation of the XCAT Grid Science Portal. The portal lets grid application programmers script complex distributed computations and package these applications with simple interfaces for others to use. Each application is packaged as a notebook which consists of web pages and editable parameterized scripts. The portal is a workstation-based specialized personal server, capable of executing the scripts and launching remote applications for the user. The server can receive event streams published by resource information...
Brown Dog is a data transformation service for the auto-curation of long-tail data. In this digital age, we have more data available for analysis than ever before, and the trend will only increase. According to most estimates, 70-80% of that data is unstructured; together with data in unsupported formats inaccessible to software tools, it is, in essence, not easily accessible or usable by its owners in any meaningful way. Brown Dog aims at making such data accessible and usable by indexing and transforming it, leveraging existing and novel tools. In this paper, we discuss recent major component improvements, including tools...
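A transformation service of this kind can be pictured as a registry of converters searched for a chain from the input format to the requested one. The registry, formats, and converter functions below are hypothetical toy examples, not Brown Dog's actual tool catalog or API:

```python
from collections import deque

# Toy converter registry: each entry maps an (input format, output format)
# pair to a conversion function. All entries here are made-up examples.
CONVERTERS = {
    ("doc", "pdf"): lambda data: data + "->pdf",
    ("pdf", "txt"): lambda data: data + "->txt",
    ("csv", "json"): lambda data: data + "->json",
}

def conversion_path(src, dst):
    """Breadth-first search over the registry: a format is reachable if
    some chain of registered converters leads to it."""
    queue = deque([(src, [])])
    seen = {src}
    while queue:
        fmt, path = queue.popleft()
        if fmt == dst:
            return path
        for (a, b), fn in CONVERTERS.items():
            if a == fmt and b not in seen:
                seen.add(b)
                queue.append((b, path + [fn]))
    return None  # no chain of converters reaches dst

def convert(data, src, dst):
    # Find a chain and apply each converter in order.
    path = conversion_path(src, dst)
    if path is None:
        raise ValueError(f"no conversion path from {src} to {dst}")
    for fn in path:
        data = fn(data)
    return data
```

Chaining is the design point: once "doc to pdf" and "pdf to txt" are registered, "doc to txt" works with no new code.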
As parallel computing grows and becomes an essential part of computer science, tools must be developed to help grade assignments for large courses, especially with the prevalence of Massive Open Online Courses (MOOCs) increasing in recent years. This paper describes some general challenges related to building autograders, along with code suggestions and sample design decisions covering the presented assignments. The paper also explores results and experiences from using these autograders in the XSEDE 2013 and 2014 Parallel Computing...
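The output-checking core of such an autograder can be sketched in a few lines: run the submission on each test case, guard against hangs with a timeout, and diff its stdout against the expected output. The command and test cases below are illustrative, not the actual course harness:

```python
import subprocess
import sys

# Minimal autograder core (illustrative sketch, not the XSEDE harness):
# run a submitted program on each case and compare stdout to expectation.
def grade(cmd, cases, timeout=10):
    """cmd: argv list for the submission; cases: list of
    (stdin_text, expected_stdout) pairs. Returns a score in [0, 1]."""
    passed = 0
    for stdin_text, expected in cases:
        try:
            result = subprocess.run(
                cmd, input=stdin_text, capture_output=True,
                text=True, timeout=timeout)
        except subprocess.TimeoutExpired:
            continue  # hung submission: no credit for this case
        # Whitespace-insensitive comparison of the program's output.
        if result.returncode == 0 and result.stdout.strip() == expected.strip():
            passed += 1
    return passed / len(cases)

# Example: a "submission" that doubles its input passes 2 of 3 cases.
score = grade(
    [sys.executable, "-c", "print(int(input()) * 2)"],
    [("3", "6"), ("4", "8"), ("5", "11")])
```

Real assignments would add compilation, resource limits, and per-case feedback, but the run-and-diff loop is the common core.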
Brown Dog is an extensible data cyberinfrastructure that provides a set of distributed conversion and metadata extraction services to enable access and search within unstructured, un-curated, and inaccessible research data across different domains of science and social science, which ultimately aids in supporting the reproducibility of results. We envision Brown Dog as an essential service in a comprehensive cyberinfrastructure that includes data services, high performance computing, and more, serving scholarly work in a variety of disciplines today not yet...
Grouping game players based on their online behaviors has attracted a lot of attention recently. However, due to the huge volume and extreme complexity of the data collections, grouping is a challenging task. This study applied parallelized K-Means on Gordon, a supercomputer hosted at the San Diego Supercomputer Center, to meet the computational challenge of this task. By using parallelization functions supported by R, the study was able to cluster 120,000 players into eight non-overlapping groups and speed up the clustering process by one to four times...
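For reference, the per-iteration logic being parallelized is the standard K-Means loop: assign each point to its nearest centroid, then recompute each centroid as the mean of its members. The serial sketch below (in Python, whereas the study used R's parallel functions) shows that loop; the data and parameters are toy values:

```python
import random

# Serial K-Means sketch. The study parallelized the assignment step
# across cores on Gordon, but the algorithm per iteration is the same.
def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from random points
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])))
            clusters[j].append(p)
        # Update step: each centroid becomes the mean of its cluster
        # (an empty cluster keeps its previous centroid).
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated toy clusters recover their true centers.
cents, _ = kmeans([(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)], 2)
```

The assignment step is embarrassingly parallel, which is why splitting points across workers (as R's parallel apply functions do) yields the reported speedups.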
Eclipse [1] is a widely used, open source integrated development environment that includes support for C, C++, Fortran, and Python. The Parallel Tools Platform (PTP) [2] extends Eclipse to support development on high performance computers. PTP allows the user to run Eclipse on her laptop, while code is compiled, run, debugged, and profiled on a remote HPC system. It provides development assistance for MPI, OpenMP, and UPC; it lets users submit jobs to the batch system and monitor the job queue. It also includes a visual parallel debugger.
Three-dimensional microscopic NMR images of spleen and liver specimens from rats injected with dextran magnetite particles, and from controls, were obtained at 4.7 T, using a specially designed probe in conjunction with a 3D filtered back projection reconstruction algorithm. All images were reconstructed as 64^3 arrays of (25 microns)^3 isotropic voxels. With the aid of the MR contrast agent, the red pulp, marginal zone, and portal triad could be distinguished from surrounding tissue in T2-weighted images. For the mature rat spleen, natural contrast was found...
An XSEDE strategic goal is to extend the use of high-end digital services to new communities by preparing the current and next generation of scholars, researchers, and engineers to use advanced digital technologies via training, education, and outreach. The mission of XSEDE's Under-Represented Community Engagement (URCE) program is to raise awareness of the value of advanced research computing and to recruit users from under-represented communities. In collaboration with XSEDE's training and education programs, URCE works with faculty and students at institutions that are non-traditional users of advanced computing resources and helps them utilize the XSEDE ecosystem.