- Scientific Computing and Data Management
- Distributed and Parallel Computing Systems
- Data Visualization and Analytics
- Research Data Management Practices
- Multimedia Communication and Technology
- Online Learning and Analytics
- Public Health in Brazil
- Software System Performance and Reliability
- Ruminant Nutrition and Digestive Physiology
- Botany and Plant Ecology Studies
- Health, Nursing, Elderly Care
- Genetics and Plant Breeding
- Botanical Research and Applications
- Computer Graphics and Visualization Techniques
- Data Management and Algorithms
- Plant Taxonomy and Phylogenetics
- Pasture and Agricultural Systems
- Graphite, nuclear technology, radiation studies
- Video Analysis and Summarization
- Growth and nutrition in plants
- Crime Patterns and Interventions
- Agricultural Science and Fertilization
- Banana Cultivation and Research
- Environmental Monitoring and Data Management
- Advanced Data Storage Technologies
Universidade de Brasília
2024
Universidade Federal do Ceará
2002-2023
University of Alberta
2023
Food & Nutrition
2023
Universidade Estadual de Montes Claros
2018
SUNY Polytechnic Institute
2012
New York University
2012
University of Utah
2006-2011
University of Southampton
2007
Argonne National Laboratory
2007
The problem of systematically capturing and managing provenance for computational tasks has recently received significant attention because of its relevance to a wide range of domains and applications. The authors give an overview of important concepts related to provenance management, so that potential users can make informed decisions when selecting or designing a provenance solution.
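As a rough illustration of the kind of capture such systems automate, the sketch below records a provenance entry per task invocation using a decorator. All names (`capture_provenance`, `PROVENANCE_LOG`) are hypothetical and not from any system described in these abstracts.

```python
import datetime
import functools
import json

# Hypothetical sketch: record, for each task invocation, what ran,
# with which inputs, when it started, and what it produced.
PROVENANCE_LOG = []

def capture_provenance(func):
    """Decorator that appends one provenance record per call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        record = {
            "task": func.__name__,
            "inputs": {"args": [repr(a) for a in args],
                       "kwargs": {k: repr(v) for k, v in kwargs.items()}},
            "started": datetime.datetime.utcnow().isoformat(),
        }
        result = func(*args, **kwargs)
        record["output"] = repr(result)
        PROVENANCE_LOG.append(record)
        return result
    return wrapper

@capture_provenance
def normalize(values):
    # Example computational task whose runs we want to trace.
    total = sum(values)
    return [v / total for v in values]

normalize([2, 3, 5])
print(json.dumps(PROVENANCE_LOG, indent=2))
```

Real provenance systems capture far more (environment, file lineage, workflow structure), but the record-per-execution pattern is the common core.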
Scientists are now faced with an incredible volume of data to analyze. To successfully analyze and validate various hypotheses, it is necessary to pose several queries, correlate disparate data, and create insightful visualizations of both the simulated processes and the observed phenomena. Often, insight comes from comparing the results of multiple visualizations. Unfortunately, today this process is far from interactive and contains many error-prone and time-consuming tasks. As a result, the generation and maintenance of visualizations is a major bottleneck in...
We present release 2.0 of the ALPS (Algorithms and Libraries for Physics Simulations) project, an open source software project to develop libraries and application programs for the simulation of strongly correlated quantum lattice models such as quantum magnets, lattice bosons, and strongly correlated fermion systems. The code development is centered on common XML and HDF5 data formats, which simplify and speed up development, as well as common evaluation and plotting tools and simulation programs. The programs enable non-experts to start carrying out serial or parallel numerical simulations by providing basic...
Abstract The first Provenance Challenge was set up in order to provide a forum for the community to understand the capabilities of different provenance systems and the expressiveness of their representations. To this end, a functional magnetic resonance imaging workflow was defined, which participants had to either simulate or run in order to produce some provenance representation, from which a set of identified queries had to be implemented and executed. Sixteen teams responded to the challenge and submitted their inputs. In this paper, we present the challenge workflow and queries, and summarize...
Abstract VisTrails is a new workflow and provenance management system that provides support for scientific data exploration and visualization. Whereas workflows have been traditionally used to automate repetitive tasks, for applications that are exploratory in nature, change is the norm. VisTrails uses a new change‐based provenance mechanism, which was designed to handle rapidly evolving workflows. It uniformly and automatically captures provenance information for data products and for the evolution of the workflows used to generate these products. In this paper, we describe how the provenance data is organized...
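The change-based idea can be sketched in a few lines: instead of storing whole workflow versions, store the *edits* as nodes of a version tree and reconstruct any version by replaying the actions from the root. This is a minimal illustration under assumed data structures, not VisTrails' actual internal representation or API.

```python
# Minimal sketch of change-based provenance: each version is a node in a
# tree whose edge holds the edit ("action") that produced it.
class VersionTree:
    def __init__(self):
        # version id -> (parent id, action); version 0 is the empty root.
        self.actions = {0: (None, None)}
        self.next_id = 1

    def add_action(self, parent, action):
        vid = self.next_id
        self.actions[vid] = (parent, action)
        self.next_id += 1
        return vid

    def reconstruct(self, vid):
        """Replay actions from the root down to version `vid`."""
        chain = []
        while vid != 0:
            parent, action = self.actions[vid]
            chain.append(action)
            vid = parent
        workflow = set()
        for op, module in reversed(chain):
            if op == "add":
                workflow.add(module)
            elif op == "delete":
                workflow.discard(module)
        return workflow

tree = VersionTree()
v1 = tree.add_action(0, ("add", "reader"))
v2 = tree.add_action(v1, ("add", "isosurface"))
# Branching captures exploratory alternatives without losing either path:
v3 = tree.add_action(v1, ("add", "volume_render"))
```

Because versions share their common prefix of actions, exploratory branching is cheap and the full evolution history is retained by construction.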
As publishers establish a greater online presence as well as the infrastructure to support the distribution of more varied information, the idea of an executable paper that enables interaction has developed. An executable paper provides more information for computational experiments and results than the text, tables, and figures of standard papers. Executable papers can bundle computational content that allows readers and reviewers to interact with, validate, and explore experiments. By including such content, authors facilitate future discoveries by lowering the barrier...
Abstract Over the last 20 years, visualization courses have been developed and offered at universities around the world. Many of these courses use established libraries and tools (e.g. VTK, ParaView, AVS, VisIt) as a way to provide students with hands‐on experience, allowing them to prototype and explore different visualization techniques. In this paper, we describe our experiences using the VisTrails platform to teach scientific visualization. VisTrails is an open‐source system that was designed to support exploratory computational tasks such as data...
Visualization is essential for understanding the increasing volumes of digital data. However, the process required to create insightful visualizations is involved and time consuming. Although several visualization tools are available, including tools with sophisticated visual interfaces, they are out of reach for users who have little or no knowledge of visualization techniques and/or who do not have programming expertise. In this paper, we propose VisMashup, a new framework for streamlining the creation of customized visualization applications. Because these...
The Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) is a new tool for analyzing and visualizing climate data. Here we provide some pointers, background information, and examples to show how the system works.
In recent years, violence has increased considerably around the world. In a certain state of Brazil, for example, the homicide rate grew from 16 homicides per 100,000 inhabitants in 2000 to 48 in 2014. Police departments worldwide use various types of crime maps, which are generated with diverse techniques, in order to analyze and fight crime. Those maps enable decision makers to identify high-risk areas and allocate resources more effectively. Hotspot maps, in particular, are often available in visual interactive systems for crime analysis...
Data analysis tasks at an Ocean Observatory require integrative and domain-specialized use of database, workflow, and visualization systems. We describe a platform to support these tasks, developed as part of the cyberinfrastructure at the NSF Science and Technology Center for Coastal Margin Observation and Prediction, integrating a provenance-aware workflow system, 3D visualization, and a remote query engine for large-scale ocean circulation models. We show how these disparate tools complement each other and give examples of real scientific...
This study assessed the prevalence of visceral leishmaniasis in blood donors from three endemic regions of Brazil and evaluated the risk of transmission by transfusion. Despite strong evidence of transmission through transfusion, the real risk, an essential condition for taking effective measures to control this serious disease, has not been determined. A multicentre study was performed in highly endemic areas. Candidates eligible for their first donation underwent a socio-epidemiological interview, and samples were collected for enzyme-linked...
Abstract Computer‐based technology has played a significant role in crime prevention over the past 30 years, especially with the popularization of spatial databases and mapping systems. Police departments frequently use hotspot analysis to identify regions that should be given priority in receiving preventive resources. Practitioners and researchers agree that tracking crime over time and identifying its geographic patterns provide vital information for planning police action efficiently. Frequently, however, police have access to systems that are too complicated...
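At its simplest, hotspot analysis bins incident locations into spatial cells and flags the densest ones. The sketch below is an illustrative grid-count approach (function name and data are made up); production systems typically use kernel density estimation or cluster detection instead.

```python
from collections import Counter

# Hypothetical sketch of grid-based hotspot detection: bin incident
# coordinates into fixed-size cells and return the densest cells.
def hotspots(incidents, cell_size, top_k):
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    return counts.most_common(top_k)

# Toy incident coordinates (e.g. kilometers in a local projection).
incidents = [(0.2, 0.3), (0.4, 0.1), (0.3, 0.4), (5.1, 5.2), (0.1, 0.2)]
print(hotspots(incidents, cell_size=1.0, top_k=1))
# cell (0, 0) concentrates four of the five incidents
```

The cell size controls the trade-off between spatial precision and statistical stability, which is one reason interactive systems let analysts adjust it.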
Simulations that require massive amounts of computing power and generate tens of terabytes of data are now part of the daily lives of scientists. Analyzing and visualizing the results of these simulations as they are computed can lead not only to early insights but also to useful knowledge that can be provided as feedback to the simulation, avoiding unnecessary use of computing power. Our work is aimed at making advanced visualization tools available to scientists in a user-friendly, Web-based environment where they can be accessed anytime from anywhere. In the context...
In this paper, we present a student dropout prediction strategy based on the classification with reject option paradigm. In such a strategy, our method classifies students into dropout-prone or non-dropout classes, and may also refuse to classify a student when the algorithm does not provide a reliable prediction. The rejected students are the ones that could be classified into either class, and so probably have more chances of success if subjected to personalized intervention activities. In the proposed method, the rejection zone can be adjusted so that the number of rejected students meets the available workforce...
Program comprehension is a fundamental activity in software maintenance and evolution, impacting several tasks such as bug fixing, code reuse, and the implementation of new features. The Atom of Confusion (AC) is considered the smallest piece of code that can confuse programmers, hindering the correct understanding of the source code under consideration. Previous studies have shown that these atoms significantly impact the presence of bugs in C++ projects and increase the time and effort needed to understand Java programs. To gather more evidence about the diffusion of ACs...
Collaboratively monitoring and analyzing large scale simulations from petascale computers is an important area of research and development within the scientific community. This paper addresses these issues when teams of colleagues from different areas work together to help understand the complex data generated by such simulations. In particular, we address geographically diverse and disparate researchers studying the science being simulated on high performance computers. Most application scientists want to focus on their sciences and spend a...