- Complex Network Analysis Techniques
- Parallel Computing and Optimization Techniques
- Advanced Graph Neural Networks
- Distributed and Parallel Computing Systems
- Interconnection Networks and Systems
- Opinion Dynamics and Social Influence
- Simulation Techniques and Applications
- Graph Theory and Algorithms
- Data Management and Algorithms
- Network Security and Intrusion Detection
- Advanced Clustering Algorithms Research
- Scientific Computing and Data Management
- Advanced Data Storage Technologies
- Information and Cyber Security
- Caching and Content Delivery
- Peer-to-Peer Network Technologies
- Anomaly Detection Techniques and Applications
- Smart Grid Security and Resilience
- Topological and Geometric Data Analysis
- Opportunistic and Delay-Tolerant Networks
- Power System Reliability and Maintenance
- Web Data Mining and Analysis
- Infrastructure Resilience and Vulnerability Analysis
- Stochastic Processes and Statistical Mechanics
- Optimal Power Flow Distribution
Sandia National Laboratories
2013-2024
Sandia National Laboratories California
2013-2023
Lawrence Berkeley National Laboratory
2002-2008
University of California, Berkeley
2008
Lawrence Livermore National Laboratory
2005
University of Illinois Urbana-Champaign
1999
Article: Improving performance of sparse matrix-vector multiplication. Ali Pinar (Department of Computer Science and Center for Simulation of Advanced Rockets, University of Illinois at Urbana-Champaign), Michael T. Heath. SC '99: Proceedings of the 1999 ACM/IEEE Conference on Supercomputing, January 1999, pp. 30–es. https://doi.org/10.1145/331532.331562
Network data is ubiquitous and growing, yet we lack realistic generative network models that can be calibrated to match real-world data. The recently proposed block two-level Erdős–Rényi (BTER) model is tuned to capture two fundamental properties: the degree distribution and the clustering coefficients. The latter is particularly important for reproducing graphs with community structure, such as social networks. In this paper, we compare BTER to other scalable models and show that it gives a better fit to real data. We provide an implementation...
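The two-level idea behind BTER can be illustrated with a small sketch. The version below implements only phase 1 under simplifying assumptions: vertices with equal target degree d are grouped into affinity blocks of d + 1 vertices, and each block is wired as an Erdős–Rényi graph with connectivity `rho`. The dense blocks supply the clustering; the second, Chung-Lu-style phase that matches the residual degrees is omitted here. The function name and parameters are illustrative, not the authors' reference implementation.

```python
import random

def bter_phase1(degrees, rho=0.8, seed=0):
    """Simplified phase 1 of a BTER-style generator: group vertices of
    equal target degree d into blocks of d + 1 and wire each block as an
    Erdos-Renyi graph G(d + 1, rho).  (A second Chung-Lu phase, omitted
    here, would match the leftover degrees.)"""
    rng = random.Random(seed)
    # Process vertices in order of target degree so blocks are homogeneous.
    order = sorted(range(len(degrees)), key=lambda v: degrees[v])
    edges = []
    i = 0
    while i < len(order):
        d = degrees[order[i]]
        block = order[i:i + d + 1]  # affinity block of (at most) d + 1 vertices
        for a in range(len(block)):
            for b in range(a + 1, len(block)):
                if rng.random() < rho:
                    edges.append((block[a], block[b]))
        i += len(block)
    return edges
```

With `rho = 1.0` every block becomes a clique, which makes the clustering contribution of phase 1 easy to see.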
Efficient design of hardware and software for large-scale parallel execution requires detailed understanding of the interactions between the application, the computer, and the network. The authors have developed a macro-scale simulator (SST/macro) that permits coarse-grained study of distributed-memory applications. In the presented work, applications using the Message Passing Interface (MPI) are simulated; however, the simulator is designed to allow inclusion of other programming models. It is driven from either a trace file or a skeleton...
Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of a graph. Some of the most useful graph metrics, especially those measuring social cohesion, are based on triangles. Despite the importance of these triadic measures, the associated algorithms can be extremely expensive. We propose a new method based on wedge sampling. This versatile technique allows for fast and accurate approximation of all current variants of clustering coefficients and enables rapid uniform sampling...
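The core of wedge sampling is simple to sketch: a wedge is a two-path u-v-w centered at v, and the global clustering coefficient is the fraction of wedges that are closed by the edge (u, w). The sketch below (function name and parameters are illustrative) samples wedge centers proportionally to the number of wedges they carry, so each wedge is equally likely:

```python
import random
from collections import defaultdict

def sample_clustering(edges, num_samples=10000, seed=0):
    """Estimate the global clustering coefficient by wedge sampling.

    A wedge is a path u-v-w centered at v; it is 'closed' if (u, w) is
    also an edge.  Global clustering = closed wedges / all wedges.
    """
    rng = random.Random(seed)
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = [v for v in adj if len(adj[v]) >= 2]
    # v carries C(deg(v), 2) wedges; weight centers accordingly so that
    # every wedge in the graph is sampled uniformly.
    weights = [len(adj[v]) * (len(adj[v]) - 1) // 2 for v in nodes]
    closed = 0
    for _ in range(num_samples):
        v = rng.choices(nodes, weights=weights)[0]
        u, w = rng.sample(sorted(adj[v]), 2)
        if w in adj[u]:
            closed += 1
    return closed / num_samples
```

On a triangle every wedge is closed (estimate 1.0); on a star no wedge is (estimate 0.0), which makes for easy sanity checks.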
Discovering dense subgraphs and understanding the relations among them is a fundamental problem in graph mining. We want to not only identify dense subgraphs, but also build a hierarchy among them (e.g., a larger but sparser subgraph formed by two smaller dense subgraphs). Peeling algorithms (k-core, k-truss, and nucleus decomposition) have been effective at locating many dense subgraphs. However, constructing a hierarchical representation of the density structure, and even correctly computing the connected k-cores and k-trusses, has mostly been overlooked. Keeping track...
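The peeling idea the abstract refers to is easiest to see for k-cores: repeatedly remove a minimum-degree vertex, and record the degree at removal time (maxed over what came before) as that vertex's core number. A minimal, unoptimized sketch (the linear-time bucket variant is the usual production choice):

```python
from collections import defaultdict

def core_numbers(edges):
    """Compute k-core numbers by peeling: repeatedly delete a vertex of
    minimum remaining degree; its core number is the largest minimum
    degree seen up to its removal."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    core = {}
    remaining = set(adj)
    current = 0
    while remaining:
        v = min(remaining, key=deg.__getitem__)  # cheapest vertex to peel
        current = max(current, deg[v])
        core[v] = current
        remaining.remove(v)
        for u in adj[v]:
            if u in remaining:
                deg[u] -= 1  # peeling v lowers its neighbors' degrees
    return core
```

A triangle with a pendant vertex illustrates the decomposition: the pendant sits in the 1-core only, the triangle forms the 2-core.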
One of the most influential recent results in network analysis is that many natural networks exhibit a power-law or log-normal degree distribution. This has inspired numerous generative models that match this property. However, more recent work has shown that while these models do have the right degree distribution, they are not good models for real-life networks due to their differences on other important metrics like conductance. We believe this is, in part, because real-world networks have very different joint degree distributions, i.e., the probability that a randomly selected edge...
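A joint degree distribution in this sense can be computed in a few lines: for each edge, record the (sorted) pair of endpoint degrees and normalize by the edge count. A small sketch, with illustrative names:

```python
from collections import Counter

def joint_degree_distribution(edges):
    """Probability that a uniformly random edge joins a degree-j vertex
    to a degree-k vertex, reported as {(j, k): probability} with j <= k."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    counts = Counter()
    for u, v in edges:
        j, k = sorted((deg[u], deg[v]))
        counts[(j, k)] += 1
    m = len(edges)
    return {jk: c / m for jk, c in counts.items()}
```

Two graphs can share a degree distribution yet differ sharply here, e.g. a star puts all edge mass on (1, hub-degree) pairs.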
We design a space-efficient algorithm that approximates the transitivity (global clustering coefficient) and total triangle count with only a single pass through a graph given as a stream of edges. Our procedure is based on a classic probabilistic result, the birthday paradox. When the transitivity is constant and there are more edges than wedges (common properties for social networks), we can prove that our algorithm requires O(√n) space (n is the number of vertices) to provide accurate estimates. We run a detailed set of experiments on a variety of real graphs...
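The birthday paradox underlying the space bound is that roughly √n uniform draws from n bins already produce a repeat with constant probability, which is why a sketch of that size sees collisions (repeated wedge endpoints) often enough to estimate counts. The streaming algorithm itself is more involved; the sketch below only computes the exact collision probability that motivates it:

```python
def collision_probability(n, k):
    """Exact probability that k independent uniform draws from n bins
    contain at least one repeated bin (the birthday paradox)."""
    p_distinct = 1.0
    for i in range(k):
        p_distinct *= (n - i) / n  # i-th draw avoids the first i bins
    return 1.0 - p_distinct
```

The classic instance: among 23 people and 365 birthdays, a shared birthday is already more likely than not.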
The path toward realizing next-generation petascale and exascale computing is increasingly dependent on building supercomputers with unprecedented numbers of processors. To prevent the interconnect from dominating the overall cost of these ultrascale systems, there is a critical need for scalable interconnects that capture the communication requirements of ultrascale applications. It is, therefore, essential to understand high-end application communication characteristics across a broad spectrum of computational methods, and to utilize that insight...
We consider the problem of designing (or augmenting) an electric power system at a minimum cost such that it satisfies the N-k-ε survivability criterion. This criterion is a generalization of the well-known N-k criterion, and requires that at least a (1-ε_j) fraction of the steady-state demand be met after failures of j components, for j = 0, 1, ..., k. The network design problem adds another level of complexity to the notoriously hard...
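Written out, the survivability requirement is a family of constraints, one per contingency. In the notation below (which is assumed, not taken from the paper), D is the set of demand nodes, d_i the steady-state demand at node i, and d_i^served(S) the demand served after the components in S fail:

```latex
\sum_{i \in D} d_i^{\mathrm{served}}(S) \;\ge\; (1 - \epsilon_j) \sum_{i \in D} d_i
\qquad \text{for every } S \text{ with } |S| = j,\; j = 0, 1, \dots, k .
```

Setting ε_j = 0 for all j recovers the standard N-k criterion, so the ε parameters relax how much demand may be shed after each failure level.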
Graph analysis is playing an increasingly important role in science and industry. Due to numerous limitations on sharing real-world graphs, models for generating massive graphs are critical for developing better algorithms. In this paper, we analyze the stochastic Kronecker graph model (SKG), which is the foundation of the Graph500 supercomputer benchmark due to its favorable properties and easy parallelization. Our goal is to provide a deeper understanding of the parameters so that its functionality as a benchmark is increased. We develop...
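SKG edge generation is simple to sketch: each edge is placed by descending `scale` levels of a 2x2 probability matrix, choosing one of four quadrants per level, so the graph has 2^scale vertices. The default quadrant probabilities below are the Graph500 values (0.57, 0.19, 0.19, 0.05); the function name is illustrative.

```python
import random

def skg_edges(scale, num_edges, probs=(0.57, 0.19, 0.19, 0.05), seed=0):
    """Sample edges of a stochastic Kronecker (R-MAT-style) graph on
    2**scale vertices: for each edge, pick a quadrant (a, b, c, d) at
    each of `scale` recursion levels to fix one row bit and one column
    bit."""
    rng = random.Random(seed)
    a, b, c, _d = probs
    edges = []
    for _ in range(num_edges):
        row = col = 0
        for _ in range(scale):
            r = rng.random()
            row <<= 1
            col <<= 1
            if r < a:
                pass                  # top-left quadrant: both bits 0
            elif r < a + b:
                col |= 1              # top-right: column bit 1
            elif r < a + b + c:
                row |= 1              # bottom-left: row bit 1
            else:
                row |= 1
                col |= 1              # bottom-right: both bits 1
        edges.append((row, col))
    return edges
```

Because every edge is sampled independently, the loop over `num_edges` parallelizes trivially, which is part of why the model suits a benchmark.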
Many scientific applications generate massive volumes of data through observations or computer simulations, bringing up the need for effective indexing methods for efficient storage and retrieval of scientific data. Unlike conventional databases, scientific data is mostly read-only and its volume can reach the order of petabytes, making a compact index structure vital. Bitmap indexing has been successfully applied to scientific databases by exploiting the fact that scientific data are enumerated or numerical. Bitmap indices can be compressed with variants of run-length encoding for a compact index structure...
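A toy version of the two ingredients makes the idea concrete: build one bitmap per distinct value, then run-length encode each bitmap. Production systems use word-aligned schemes (e.g. WAH) rather than the naive pair encoding below; function names are illustrative.

```python
def bitmap_index(values, domain):
    """One bitmap per distinct value v: bit i is set iff values[i] == v,
    so an equality query becomes a lookup plus bitwise operations."""
    return {v: [1 if x == v else 0 for x in values] for v in domain}

def rle_compress(bits):
    """Run-length encode a bitmap as (bit, run_length) pairs -- a toy
    stand-in for word-aligned schemes used in practice."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1
        else:
            runs.append([b, 1])
    return [tuple(r) for r in runs]
```

Read-only, low-cardinality columns compress especially well here, since long runs of identical bits dominate each bitmap.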
The Similarity between Stochastic Kronecker and Chung-Lu Graph Models. Ali Pinar, C. Seshadhri, Tamara G. Kolda. Proceedings of the 2012 SIAM International Conference on Data Mining (SDM), pp. 1071–1082. https://doi.org/10.1137/1.9781611972825.92
Abstract: The analysis of massive graphs is now becoming a very important part of science and industrial research...
As we enter the era of peta-scale computing, system architects must plan for machines composed of tens or even hundreds of thousands of processors. Although fully connected networks such as fat-tree configurations currently dominate HPC interconnect designs, such approaches are inadequate at ultra-scale concurrencies due to the superlinear growth of component costs. Traditional low-degree topologies, such as 3D tori, have reemerged as a competitive solution due to the linear scaling of component count relative to node count; however, they are poorly suited...
The concept of k-cores is important for understanding the global structure of networks, as well as for identifying central or important nodes within a network. It is often valuable to understand the resilience of the k-cores of a network to attacks and dropped edges (i.e., damaged communications links). We provide a formal definition of a network's core resilience, and examine the problem of characterizing core resilience in terms of structural features: in particular, which structural properties cause a network to have high or low core resilience? To measure this, we introduce two novel node properties, Core...
The increasing complexity of both scientific simulations and high-performance computing system architectures is driving the need for adaptive workflows, in which the composition and execution of computational and data manipulation steps dynamically depend on the evolutionary state of the simulation itself. Consider, for example, the frequency of data storage. Critical phases of the simulation should be captured with high fidelity for postanalysis; however, we cannot afford to retain the same full fidelity throughout, due to the cost of data movement. We can instead look for triggers, indicators...
Protecting against multi-step attacks of uncertain start times and duration forces the defenders into an indefinite, always ongoing, resource-intensive response. To allocate resources effectively, the defender must analyze and respond to an ongoing stream of potentially undetected attacks, and must take measures of attack and response intensity over time into account. Such analysis requires estimation of overall success metrics and evaluating the effect of strategies and actions associated with specific attack steps on those metrics. We present a novel game-theoretic...
Markov chains (MCs) are a convenient means of generating realizations of networks with a given (joint or otherwise) degree distribution (DD), since they simply require a procedure for rewiring edges. The major challenge is to find the right number of steps to run such a chain, so that we generate truly independent samples. Theoretical bounds on the mixing times of these MCs are too large to be practically useful. Practitioners have no useful guide for choosing the chain length, and tend to pick numbers fairly arbitrarily. We give...
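The rewiring procedure such chains rely on is the standard degree-preserving double edge swap: pick two edges (u, v) and (x, y), and replace them with (u, y) and (x, v), rejecting the move if it would create a self-loop or a multi-edge. A minimal sketch (function name and parameters are illustrative):

```python
import random

def rewire(edges, num_steps, seed=0):
    """Run a degree-preserving double-edge-swap Markov chain on a
    simple undirected graph given as a list of edge tuples."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(frozenset(e) for e in edges)
    for _ in range(num_steps):
        i, j = rng.sample(range(len(edges)), 2)
        u, v = edges[i]
        x, y = edges[j]
        if len({u, v, x, y}) < 4:
            continue  # swap would create a self-loop or degenerate pair
        if frozenset((u, y)) in edge_set or frozenset((x, v)) in edge_set:
            continue  # swap would create a multi-edge
        edge_set -= {frozenset((u, v)), frozenset((x, y))}
        edge_set |= {frozenset((u, y)), frozenset((x, v))}
        edges[i], edges[j] = (u, y), (x, v)
    return edges
```

Each accepted swap leaves every vertex degree unchanged, so the chain walks over the set of simple graphs with the prescribed degree sequence; the open question the abstract addresses is how many such steps suffice for the samples to be effectively independent.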
Abstract
Background: The efficient biological production of industrially and economically important compounds is a challenging problem. Brute-force determination of the optimal pathways to a target chemical in a chassis organism is computationally intractable. Many current methods provide a single solution to this problem, but fail to provide all optimal pathways, optional sub-optimal solutions, or hybrid biological/non-biological solutions.
Results: Here we present RetSynth, software with a novel algorithm for determining, for a given...