May 5, 2019

3155 words 15 mins read

Paper Group ANR 560

Contextual RNN-GANs for Abstract Reasoning Diagram Generation. A Survey of Voice Translation Methodologies - Acoustic Dialect Decoder. Formal Concept Analysis of Rodent Carriers of Zoonotic Disease. Measuring the Quality of Exercises. Mobile Big Data Analytics Using Deep Learning and Apache Spark. Enabling Basic Normative HRI in a Cognitive Robotic …

Contextual RNN-GANs for Abstract Reasoning Diagram Generation

Title Contextual RNN-GANs for Abstract Reasoning Diagram Generation
Authors Arnab Ghosh, Viveka Kulharia, Amitabha Mukerjee, Vinay Namboodiri, Mohit Bansal
Abstract Understanding, predicting, and generating object motions and transformations is a core problem in artificial intelligence. Modeling sequences of evolving images may provide better representations and models of motion and may ultimately be used for forecasting, simulation, or video generation. Diagrammatic Abstract Reasoning is an avenue in which diagrams evolve in complex patterns and one needs to infer the underlying pattern sequence and generate the next image in the sequence. For this, we develop a novel Contextual Generative Adversarial Network based on Recurrent Neural Networks (Context-RNN-GANs), where both the generator and the discriminator modules are based on contextual history (modeled as RNNs) and the adversarial discriminator guides the generator to produce realistic images for the particular time step in the image sequence. We evaluate the Context-RNN-GAN model (and its variants) on a novel dataset of Diagrammatic Abstract Reasoning, where it performs competitively with 10th-grade human performance but there is still scope for interesting improvements as compared to college-grade human performance. We also evaluate our model on a standard video next-frame prediction task, achieving improved performance over comparable state-of-the-art.
Tasks Video Generation
Published 2016-09-29
URL http://arxiv.org/abs/1609.09444v2
PDF http://arxiv.org/pdf/1609.09444v2.pdf
PWC https://paperswithcode.com/paper/contextual-rnn-gans-for-abstract-reasoning
Repo
Framework
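
As a rough illustration of the architecture the abstract describes, the sketch below pairs an RNN generator and an RNN discriminator that both read the context of earlier diagrams, with the discriminator scoring a candidate next frame given that history. The image size, hidden sizes, and training step are illustrative assumptions in PyTorch, not the authors' configuration.

```python
# Minimal, hypothetical Context-RNN-GAN sketch: sizes and hyperparameters are assumptions.
import torch
import torch.nn as nn

IMG = 32 * 32  # flattened diagram size (assumed)

class ContextGenerator(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(IMG, hidden, batch_first=True)
        self.out = nn.Sequential(nn.Linear(hidden, IMG), nn.Sigmoid())

    def forward(self, context):           # context: (B, T, IMG)
        _, h = self.rnn(context)          # summary of the diagram history
        return self.out(h[-1])            # proposed next frame: (B, IMG)

class ContextDiscriminator(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(IMG, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, context, frame):    # judge the frame given its context
        seq = torch.cat([context, frame.unsqueeze(1)], dim=1)
        _, h = self.rnn(seq)
        return self.score(h[-1])          # real/fake logit

G, D = ContextGenerator(), ContextDiscriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

context = torch.rand(8, 5, IMG)           # toy batch: 5 past diagrams
real_next = torch.rand(8, IMG)            # toy ground-truth 6th diagram

# one adversarial step: update D on real vs. generated frames, then update G
fake_next = G(context)
d_loss = bce(D(context, real_next), torch.ones(8, 1)) + \
         bce(D(context, fake_next.detach()), torch.zeros(8, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

g_loss = bce(D(context, fake_next), torch.ones(8, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```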

A Survey of Voice Translation Methodologies - Acoustic Dialect Decoder

Title A Survey of Voice Translation Methodologies - Acoustic Dialect Decoder
Authors Hans Krupakar, Keerthika Rajvel, Bharathi B, Angel Deborah S, Vallidevi Krishnamurthy
Abstract Speech Translation has always been about giving source text or audio input and waiting for the system to give translated output in the desired form. In this paper, we present the Acoustic Dialect Decoder (ADD) - a voice-to-voice ear-piece translation device. We introduce and survey the recent advances made in the field of Speech Engineering, to employ in the ADD, particularly focusing on the three major processing steps of Recognition, Translation and Synthesis. We tackle the problem of machine understanding of natural language by designing a recognition unit for source audio to text, a translation unit for source language text to target language text, and a synthesis unit for target language text to target language speech. Speech from the surroundings will be recorded by the recognition unit present on the ear-piece and translation will start as soon as one sentence is successfully read. This way, we hope to give translated output as and when the input is being read. The recognition unit will use the Hidden Markov Model (HMM) based Tool-Kit (HTK) and hybrid RNN systems with gated memory cells, and the synthesis unit will use the HMM-based speech synthesis system HTS. This system will initially be built as an English to Tamil translation device.
Tasks Speech Synthesis
Published 2016-10-13
URL http://arxiv.org/abs/1610.03934v1
PDF http://arxiv.org/pdf/1610.03934v1.pdf
PWC https://paperswithcode.com/paper/a-survey-of-voice-translation-methodologies
Repo
Framework
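
A schematic of the three-stage pipeline the survey proposes (recognition, translation, synthesis), run sentence by sentence so translated speech can be produced while the speaker is still talking. The stage implementations below are placeholders; the paper itself proposes HTK and hybrid RNNs for recognition and HTS for synthesis.

```python
# Placeholder pipeline sketch; the real stages would wrap HTK/RNN ASR, MT, and HTS TTS.
from typing import Iterator

def recognize(audio_chunk: bytes) -> str:
    """Placeholder ASR stage (HMM/RNN recognizer in the proposed device)."""
    return "<english sentence>"

def translate(text_en: str) -> str:
    """Placeholder English-to-Tamil machine translation stage."""
    return "<tamil sentence>"

def synthesize(text_ta: str) -> bytes:
    """Placeholder TTS stage (HMM-based synthesis, HTS, in the proposed device)."""
    return b"<tamil audio>"

def acoustic_dialect_decoder(sentences: Iterator[bytes]) -> Iterator[bytes]:
    """Stream translated speech as soon as each source sentence is recognized."""
    for chunk in sentences:
        yield synthesize(translate(recognize(chunk)))

# toy usage: two recorded "sentences" flow through the ear-piece pipeline
for audio_out in acoustic_dialect_decoder([b"s1", b"s2"]):
    print(len(audio_out), "bytes of synthesized Tamil speech")
```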

Formal Concept Analysis of Rodent Carriers of Zoonotic Disease

Title Formal Concept Analysis of Rodent Carriers of Zoonotic Disease
Authors Roman Ilin, Barbara A. Han
Abstract The technique of Formal Concept Analysis is applied to a dataset describing the traits of rodents, with the goal of identifying zoonotic disease carriers, or those species carrying infections that can spill over to cause human disease. The concepts identified among these species together provide rules-of-thumb about the intrinsic biological features of rodents that carry zoonotic diseases, and offer utility for better targeting field surveillance efforts in the search for novel disease carriers in the wild.
Tasks
Published 2016-08-25
URL http://arxiv.org/abs/1608.07241v1
PDF http://arxiv.org/pdf/1608.07241v1.pdf
PWC https://paperswithcode.com/paper/formal-concept-analysis-of-rodent-carriers-of
Repo
Framework
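
For readers unfamiliar with Formal Concept Analysis, the sketch below computes all formal concepts of a tiny, invented object-attribute context (hypothetical rodent species and traits). Each concept pairs an extent (a set of species) with an intent (the traits they all share), which is the kind of rule-of-thumb structure the paper extracts from real trait data.

```python
# Brute-force FCA on a toy context; the species and traits are purely illustrative.
from itertools import combinations

context = {
    "species_A": {"fast_reproduction", "urban", "carrier"},
    "species_B": {"fast_reproduction", "carrier"},
    "species_C": {"urban"},
}
attributes = sorted(set().union(*context.values()))

def extent(attrs):    # objects having every attribute in attrs
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):     # attributes shared by every object in objs
    return set(attributes) if not objs else set.intersection(*(context[o] for o in objs))

concepts = set()
for r in range(len(attributes) + 1):
    for subset in combinations(attributes, r):
        e = extent(set(subset))
        concepts.add((frozenset(e), frozenset(intent(e))))   # closure of the subset

for e, i in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(e), "<->", sorted(i))
```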

Measuring the Quality of Exercises

Title Measuring the Quality of Exercises
Authors Paritosh Parmar, Brendan Tran Morris
Abstract This work explores the problem of exercise quality measurement, since it is essential for effective management of diseases like cerebral palsy (CP). It examines the automated assessment of the quality of large amplitude movement (LAM) exercises designed to treat CP. Exercise data were collected by trained participants to generate ideal examples to use as positive samples for machine learning. Following that, subjects were asked to deliberately make subtle errors during the exercise, such as restricting movements, as is commonly seen in patients suffering from CP. The quality measurement problem was then posed as a classification problem: determine whether an example exercise was “good” or “bad”. Popular machine learning techniques for classification, including support vector machines (SVM), single- and double-layered neural networks (NN), boosted decision trees, and dynamic time warping (DTW), were compared. The AdaBoosted tree performed best with an accuracy of 94.68%, demonstrating the feasibility of assessing exercise quality.
Tasks
Published 2016-08-31
URL http://arxiv.org/abs/1608.09005v1
PDF http://arxiv.org/pdf/1608.09005v1.pdf
PWC https://paperswithcode.com/paper/measuring-the-quality-of-exercises
Repo
Framework
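
A hedged sketch of the classification setup: feature vectors for “good” and “bad” exercise repetitions are separated with an AdaBoosted decision tree, the best performer in the paper. The synthetic features below are stand-ins for the real motion features.

```python
# Toy good/bad exercise classification with AdaBoost; features are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
good = rng.normal(loc=1.0, scale=0.3, size=(200, 10))   # full-amplitude movements
bad = rng.normal(loc=0.6, scale=0.3, size=(200, 10))    # deliberately restricted movements
X = np.vstack([good, bad])
y = np.array([1] * 200 + [0] * 200)                      # 1 = "good", 0 = "bad"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("toy accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```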

Mobile Big Data Analytics Using Deep Learning and Apache Spark

Title Mobile Big Data Analytics Using Deep Learning and Apache Spark
Authors Mohammad Abu Alsheikh, Dusit Niyato, Shaowei Lin, Hwee-Pink Tan, Zhu Han
Abstract The proliferation of mobile devices, such as smartphones and Internet of Things (IoT) gadgets, has given rise to the recent mobile big data (MBD) era. Collecting MBD is unprofitable unless suitable analytics and learning methods are utilized to extract meaningful information and hidden patterns from the data. This article presents an overview and brief tutorial of deep learning in MBD analytics and discusses a scalable learning framework over Apache Spark. Specifically, distributed deep learning is executed as iterative MapReduce computing on many Spark workers. Each Spark worker learns a partial deep model on a partition of the overall MBD, and a master deep model is then built by averaging the parameters of all partial models. This Spark-based framework speeds up the learning of deep models consisting of many hidden layers and millions of parameters. We use a context-aware activity recognition application with a real-world dataset containing millions of samples to validate our framework and assess its speedup effectiveness.
Tasks Activity Recognition
Published 2016-02-23
URL http://arxiv.org/abs/1602.07031v1
PDF http://arxiv.org/pdf/1602.07031v1.pdf
PWC https://paperswithcode.com/paper/mobile-big-data-analytics-using-deep-learning
Repo
Framework
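
The parameter-averaging scheme can be illustrated without a cluster: each simulated worker below refines the broadcast master model on its own data partition, and the master is rebuilt by averaging the partial models each round. Plain NumPy and a toy linear model stand in for the Spark workers and deep models of the paper.

```python
# Iterative MapReduce-style parameter averaging, simulated with NumPy "workers".
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(6000, 3))
y = X @ true_w + rng.normal(scale=0.1, size=6000)

partitions = np.array_split(np.arange(6000), 4)   # 4 simulated Spark workers
w_master = np.zeros(3)

def local_sgd(w, Xp, yp, lr=0.05, epochs=5):
    """Each worker refines the broadcast master model on its own data partition."""
    for _ in range(epochs):
        grad = 2 * Xp.T @ (Xp @ w - yp) / len(yp)
        w = w - lr * grad
    return w

for _ in range(10):                                # map: local training; reduce: averaging
    partial = [local_sgd(w_master.copy(), X[idx], y[idx]) for idx in partitions]
    w_master = np.mean(partial, axis=0)

print("averaged model:", w_master.round(3), "true:", true_w)
```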

Enabling Basic Normative HRI in a Cognitive Robotic Architecture

Title Enabling Basic Normative HRI in a Cognitive Robotic Architecture
Authors Vasanth Sarathy, Jason R. Wilson, Thomas Arnold, Matthias Scheutz
Abstract Collaborative human activities are grounded in social and moral norms, which humans consciously and subconsciously use to guide and constrain their decision-making and behavior, thereby strengthening their interactions and preventing emotional and physical harm. This type of norm-based processing is also critical for robots in many human-robot interaction scenarios (e.g., when helping elderly and disabled persons in assisted living facilities, or assisting humans in assembly tasks in factories or even the space station). In this position paper, we will briefly describe how several components in an integrated cognitive architecture can be used to implement processes that are required for normative human-robot interactions, especially in collaborative tasks where actions and situations could potentially be perceived as threatening and thus need a change in course of action to mitigate the perceived threats.
Tasks Decision Making
Published 2016-02-11
URL http://arxiv.org/abs/1602.03814v1
PDF http://arxiv.org/pdf/1602.03814v1.pdf
PWC https://paperswithcode.com/paper/enabling-basic-normative-hri-in-a-cognitive
Repo
Framework

A nonparametric HMM for genetic imputation and coalescent inference

Title A nonparametric HMM for genetic imputation and coalescent inference
Authors Lloyd T. Elliott, Yee Whye Teh
Abstract Genetic sequence data are well described by hidden Markov models (HMMs) in which latent states correspond to clusters of similar mutation patterns. Theory from statistical genetics suggests that these HMMs are nonhomogeneous (their transition probabilities vary along the chromosome) and have large support for self transitions. We develop a new nonparametric model of genetic sequence data, based on the hierarchical Dirichlet process, which supports these self transitions and nonhomogeneity. Our model provides a parameterization of the genetic process that is more parsimonious than other more general nonparametric models which have previously been applied to population genetics. We provide truncation-free MCMC inference for our model using a new auxiliary sampling scheme for Bayesian nonparametric HMMs. In a series of experiments on male X chromosome data from the Thousand Genomes Project and also on data simulated from a population bottleneck we show the benefits of our model over the popular finite model fastPHASE, which can itself be seen as a parametric truncation of our model. We find that the number of HMM states found by our model is correlated with the time to the most recent common ancestor in population bottlenecks. This work demonstrates the flexibility of Bayesian nonparametrics applied to large and complex genetic data.
Tasks Imputation
Published 2016-11-02
URL http://arxiv.org/abs/1611.00544v1
PDF http://arxiv.org/pdf/1611.00544v1.pdf
PWC https://paperswithcode.com/paper/a-nonparametric-hmm-for-genetic-imputation
Repo
Framework
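
A toy simulation of the two properties the model is built around: transition probabilities that vary along the chromosome (nonhomogeneity) and a large probability of self transitions. The hierarchical Dirichlet process prior and the MCMC sampler are omitted; this only generates sticky, position-dependent state sequences.

```python
# Simulate a sticky, nonhomogeneous state sequence; the HDP machinery is not shown.
import numpy as np

rng = np.random.default_rng(1)
K, L = 4, 200                          # number of latent clusters, sequence length
self_stick = 0.95                      # strong self-transition mass

states = np.empty(L, dtype=int)
states[0] = rng.integers(K)
for t in range(1, L):
    # position-dependent switching rate: nonhomogeneous along the sequence
    stay = self_stick - 0.3 * np.sin(2 * np.pi * t / L) ** 2
    probs = np.full(K, (1 - stay) / (K - 1))
    probs[states[t - 1]] = stay
    states[t] = rng.choice(K, p=probs)

# segments of constant state correspond to stretches of shared mutation pattern
changes = np.flatnonzero(np.diff(states)) + 1
print("number of segments:", len(changes) + 1)
```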

Cohesion and Coalition Formation in the European Parliament: Roll-Call Votes and Twitter Activities

Title Cohesion and Coalition Formation in the European Parliament: Roll-Call Votes and Twitter Activities
Authors Darko Cherepnalkoski, Andreas Karpf, Igor Mozetic, Miha Grcar
Abstract We study the cohesion within and the coalitions between political groups in the Eighth European Parliament (2014–2019) by analyzing two entirely different aspects of the behavior of the Members of the European Parliament (MEPs) in the policy-making processes. On one hand, we analyze their co-voting patterns and, on the other, their retweeting behavior. We make use of two diverse datasets in the analysis. The first one is the roll-call vote dataset, where cohesion is regarded as the tendency to co-vote within a group, and a coalition is formed when the members of several groups exhibit a high degree of co-voting agreement on a subject. The second dataset comes from Twitter; it captures the retweeting (i.e., endorsing) behavior of the MEPs and implies cohesion (retweets within the same group) and coalitions (retweets between groups) from a completely different perspective. We employ two different methodologies to analyze the cohesion and coalitions. The first one is based on Krippendorff’s Alpha reliability, used to measure the agreement between raters in data-analysis scenarios, and the second one is based on Exponential Random Graph Models, often used in social-network analysis. We give general insights into the cohesion of political groups in the European Parliament, explore whether coalitions are formed in the same way for different policy areas, and examine to what degree the retweeting behavior of MEPs corresponds to their co-voting patterns. A novel and interesting aspect of our work is the relationship between the co-voting and retweeting patterns.
Tasks
Published 2016-08-17
URL http://arxiv.org/abs/1608.04917v2
PDF http://arxiv.org/pdf/1608.04917v2.pdf
PWC https://paperswithcode.com/paper/cohesion-and-coalition-formation-in-the
Repo
Framework
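
The first methodology mentioned above, Krippendorff’s Alpha for nominal data, can be computed directly from a votes-by-MEPs matrix. The generic implementation below (not the paper’s exact pipeline) treats each roll-call vote as a unit and absent MEPs as missing values; high alpha indicates cohesive co-voting.

```python
# Krippendorff's alpha for nominal data on a toy roll-call matrix (None = absent MEP).
from collections import Counter

def krippendorff_alpha_nominal(units):
    """units: list of lists of values (None = missing), one inner list per vote."""
    pairable = [[v for v in u if v is not None] for u in units]
    pairable = [u for u in pairable if len(u) >= 2]
    n = sum(len(u) for u in pairable)
    totals = Counter(v for u in pairable for v in u)

    # observed disagreement: within-unit pairs with differing values
    d_o = 0.0
    for u in pairable:
        counts = Counter(u)
        m = len(u)
        disagreeing_pairs = m * (m - 1) - sum(c * (c - 1) for c in counts.values())
        d_o += disagreeing_pairs / (m - 1)
    d_o /= n

    # expected disagreement: pairs drawn from the pooled value distribution
    d_e = (n * (n - 1) - sum(c * (c - 1) for c in totals.values())) / (n * (n - 1))
    return 1.0 if d_e == 0 else 1.0 - d_o / d_e

votes = [
    ["for", "for", "for", None],                       # cohesive vote, one MEP absent
    ["for", "for", "against", "for"],
    ["against", "against", "against", "against"],
]
print("alpha:", round(krippendorff_alpha_nominal(votes), 3))
```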

Temporal Matrix Completion with Locally Linear Latent Factors for Medical Applications

Title Temporal Matrix Completion with Locally Linear Latent Factors for Medical Applications
Authors Frodo Kin Sun Chan, Andy J Ma, Pong C Yuen, Terry Cheuk-Fung Yip, Yee-Kit Tse, Vincent Wai-Sun Wong, Grace Lai-Hung Wong
Abstract Regular medical records are useful for medical practitioners to analyze and monitor patient health status, especially for those with chronic disease, but such records are usually incomplete due to unpunctuality and absence of patients. To resolve this missing-data problem over time, tensor-based models have been suggested in recent papers for missing-data imputation, because this approach exploits a low-rank tensor assumption for highly correlated data. However, when the time intervals between records are long, the data correlation is not high along the temporal direction and this assumption is not valid. To address this problem, we propose to decompose a matrix with missing data into its latent factors. A locally linear constraint is then imposed on these factors for matrix completion. Using a publicly available dataset and two medical datasets collected from hospitals, experimental results show that the proposed algorithm achieves the best performance compared with existing methods.
Tasks Imputation, Matrix Completion
Published 2016-10-31
URL http://arxiv.org/abs/1611.00800v1
PDF http://arxiv.org/pdf/1611.00800v1.pdf
PWC https://paperswithcode.com/paper/temporal-matrix-completion-with-locally
Repo
Framework
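
A hedged sketch of the latent-factor decomposition on a toy patients-by-time matrix with missing entries. A simple temporal smoothness penalty on the time factors stands in for the paper’s locally linear constraint, which is more elaborate than what is shown here.

```python
# Masked matrix factorization with a temporal smoothness penalty (a stand-in regularizer).
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_times, rank = 30, 40, 3
U_true = rng.normal(size=(n_patients, rank))
V_true = np.cumsum(rng.normal(scale=0.1, size=(n_times, rank)), axis=0)  # slowly varying
M = U_true @ V_true.T
mask = rng.random(M.shape) < 0.4                     # only 40% of entries are observed

U = rng.normal(scale=0.1, size=(n_patients, rank))
V = rng.normal(scale=0.1, size=(n_times, rank))
lr, lam = 0.01, 0.5
for _ in range(2000):
    R = mask * (U @ V.T - M)                         # residual on observed entries only
    D = V[1:] - V[:-1]                               # temporal differences between factors
    grad_smooth = np.zeros_like(V)
    grad_smooth[:-1] -= D
    grad_smooth[1:] += D
    U -= lr * (R @ V)
    V -= lr * (R.T @ U + lam * grad_smooth)

rmse = np.sqrt(np.mean((U @ V.T - M)[~mask] ** 2))
print("toy held-out RMSE:", round(rmse, 3))
```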

On Bayesian index policies for sequential resource allocation

Title On Bayesian index policies for sequential resource allocation
Authors Emilie Kaufmann
Abstract This paper is about index policies for minimizing (frequentist) regret in a stochastic multi-armed bandit model, inspired by a Bayesian view on the problem. Our main contribution is to prove that the Bayes-UCB algorithm, which relies on quantiles of posterior distributions, is asymptotically optimal when the reward distributions belong to a one-dimensional exponential family, for a large class of prior distributions. We also show that the Bayesian literature gives new insight on what kind of exploration rates could be used in frequentist, UCB-type algorithms. Indeed, approximations of the Bayesian optimal solution or the Finite Horizon Gittins indices provide a justification for the kl-UCB+ and kl-UCB-H+ algorithms, whose asymptotic optimality is also established.
Tasks
Published 2016-01-06
URL http://arxiv.org/abs/1601.01190v3
PDF http://arxiv.org/pdf/1601.01190v3.pdf
PWC https://paperswithcode.com/paper/on-bayesian-index-policies-for-sequential
Repo
Framework
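
The Bayes-UCB index itself is simple to state: with Bernoulli rewards and Beta(1,1) priors, pull the arm whose posterior upper quantile is largest, with the quantile level rising over time. The sketch below follows that recipe; the horizon, arm means, and quantile schedule are illustrative.

```python
# Bayes-UCB on toy Bernoulli arms: score each arm by an upper posterior quantile.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)
means = np.array([0.3, 0.5, 0.6])          # toy Bernoulli arm means
T = 5000
successes = np.zeros(len(means))
failures = np.zeros(len(means))
regret = 0.0

for t in range(1, T + 1):
    level = 1.0 - 1.0 / (t * np.log(T) ** 5)        # quantile 1 - 1/(t * log(T)^c)
    indices = beta.ppf(level, successes + 1, failures + 1)
    arm = int(np.argmax(indices))
    reward = rng.random() < means[arm]
    successes[arm] += reward
    failures[arm] += 1 - reward
    regret += means.max() - means[arm]

print("cumulative regret over", T, "rounds:", round(regret, 1))
```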

DP-EM: Differentially Private Expectation Maximization

Title DP-EM: Differentially Private Expectation Maximization
Authors Mijung Park, Jimmy Foulds, Kamalika Chaudhuri, Max Welling
Abstract The iterative nature of the expectation maximization (EM) algorithm presents a challenge for privacy-preserving estimation, as each iteration increases the amount of noise needed. We propose a practical private EM algorithm that overcomes this challenge using two innovations: (1) a novel moment perturbation formulation for differentially private EM (DP-EM), and (2) the use of two recently developed composition methods to bound the privacy “cost” of multiple EM iterations: the moments accountant (MA) and zero-mean concentrated differential privacy (zCDP). Both MA and zCDP bound the moment generating function of the privacy loss random variable and achieve a refined tail bound, which effectively decreases the amount of additive noise. We present empirical results showing the benefits of our approach, as well as similar performance between these two composition methods in the DP-EM setting for Gaussian mixture models. Our approach can be readily extended to many iterative learning algorithms, opening up various exciting future directions.
Tasks
Published 2016-05-23
URL http://arxiv.org/abs/1605.06995v2
PDF http://arxiv.org/pdf/1605.06995v2.pdf
PWC https://paperswithcode.com/paper/dp-em-differentially-private-expectation
Repo
Framework
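
The zCDP accounting mentioned in the abstract composes cleanly: a Gaussian-perturbed release costs rho = Delta^2 / (2 sigma^2), the costs of successive EM iterations add, and the total converts to an (epsilon, delta) guarantee. The sketch below does only this bookkeeping, with illustrative numbers; the moment perturbation of the EM steps themselves is not shown.

```python
# zCDP bookkeeping for iterated Gaussian-mechanism releases; numbers are illustrative.
import math

def gaussian_mechanism_rho(sensitivity: float, sigma: float) -> float:
    """zCDP cost of one noisy release of a statistic with the given L2 sensitivity."""
    return sensitivity ** 2 / (2 * sigma ** 2)

def zcdp_to_eps(rho: float, delta: float) -> float:
    """Standard conversion from rho-zCDP to an (epsilon, delta)-DP guarantee."""
    return rho + 2 * math.sqrt(rho * math.log(1 / delta))

iterations, sensitivity, sigma, delta = 20, 1.0, 8.0, 1e-5
rho_total = iterations * gaussian_mechanism_rho(sensitivity, sigma)  # costs add up
print(f"rho after {iterations} EM iterations: {rho_total:.4f}")
print(f"implied epsilon at delta={delta}: {zcdp_to_eps(rho_total, delta):.3f}")
```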

Incremental Reconstruction of Urban Environments by Edge-Points Delaunay Triangulation

Title Incremental Reconstruction of Urban Environments by Edge-Points Delaunay Triangulation
Authors Andrea Romanoni, Matteo Matteucci
Abstract Urban reconstruction from a video captured by a surveying vehicle constitutes a core module of automated mapping. When computational power is limited and a detailed map is not the primary goal, the reconstruction can be performed incrementally, from a monocular video, by carving a 3D Delaunay triangulation of sparse points; this allows online incremental mapping for tasks such as traversability analysis or obstacle avoidance. To exploit the sharp edges of the urban landscape, we propose to use a Delaunay triangulation of Edge-Points, which are the 3D points corresponding to image edges. These points constrain the edges of the 3D Delaunay triangulation to real-world edges. Besides the use of Edge-Points, a second contribution of this paper is the Inverse Cone Heuristic, which preemptively avoids the creation of artifacts in the reconstructed manifold surface. We force the reconstruction of a manifold surface since this makes it possible to apply computer graphics or photometric refinement algorithms to the output mesh. We evaluated our approach on four real sequences of the publicly available KITTI dataset by comparing the incremental reconstruction against Velodyne measurements.
Tasks
Published 2016-04-21
URL http://arxiv.org/abs/1604.06232v2
PDF http://arxiv.org/pdf/1604.06232v2.pdf
PWC https://paperswithcode.com/paper/incremental-reconstruction-of-urban
Repo
Framework
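
One ingredient of the pipeline, the Delaunay triangulation over 3D Edge-Points, can be sketched with an off-the-shelf library; the free-space carving, the Inverse Cone Heuristic, and the manifold extraction are omitted, and the point cloud below is synthetic.

```python
# 3D Delaunay triangulation of synthetic "edge points" via scipy/Qhull.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# toy edge points: points concentrated along two sharp, building-like edges plus filler
edge1 = np.column_stack([np.linspace(0, 10, 60), np.zeros(60), rng.normal(0, 0.02, 60)])
edge2 = np.column_stack([np.zeros(60), np.linspace(0, 10, 60), rng.normal(0, 0.02, 60)])
filler = rng.uniform(0, 10, size=(80, 3))
points = np.vstack([edge1, edge2, filler])

tri = Delaunay(points)                   # 3D Delaunay triangulation -> tetrahedra
print("points:", len(points), "tetrahedra:", len(tri.simplices))

# each tetrahedron could then be labelled free/occupied by ray casting from the camera,
# and the surface extracted as the boundary between the two labels (not shown here).
```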

Sorting out symptoms: design and evaluation of the ‘babylon check’ automated triage system

Title Sorting out symptoms: design and evaluation of the ‘babylon check’ automated triage system
Authors Katherine Middleton, Mobasher Butt, Nils Hammerla, Steven Hamblin, Karan Mehta, Ali Parsa
Abstract Prior to seeking professional medical care, it is increasingly common for patients to use online resources such as automated symptom checkers. Many such systems attempt to provide a differential diagnosis based on the symptoms elicited from the user, which may lead to anxiety if life- or limb-threatening conditions are part of the list, a phenomenon termed ‘cyberchondria’ [1]. Systems that provide advice on where to seek help, rather than a diagnosis, are equally popular, and in our view provide the most useful information. In this technical report we describe how such a triage system can be modelled computationally, how medical insights can be translated into triage flows, and how such systems can be validated and tested. We present babylon check, our commercially deployed automated triage system, as a case study, and illustrate its performance in a large, semi-naturalistic deployment study.
Tasks
Published 2016-06-07
URL http://arxiv.org/abs/1606.02041v1
PDF http://arxiv.org/pdf/1606.02041v1.pdf
PWC https://paperswithcode.com/paper/sorting-out-symptoms-design-and-evaluation-of
Repo
Framework
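
As a toy illustration of modelling a triage flow computationally, the sketch below encodes a small directed graph of yes/no questions whose leaves are triage outcomes. The questions and outcomes are invented for the sketch and are not babylon check’s clinical content.

```python
# A toy triage flow: internal nodes are yes/no questions, leaves are dispositions.
triage_flow = {
    "chest_pain": ("Severe chest pain right now?", "emergency", "breathless"),
    "breathless": ("Short of breath at rest?", "urgent_gp", "duration"),
    "duration": ("Symptoms for more than a week?", "routine_gp", "self_care"),
}
outcomes = {"emergency", "urgent_gp", "routine_gp", "self_care"}

def triage(answers, start="chest_pain"):
    """Walk the flow; answers maps question id -> bool."""
    node = start
    while node not in outcomes:
        _question, if_yes, if_no = triage_flow[node]
        node = if_yes if answers[node] else if_no
    return node

print(triage({"chest_pain": False, "breathless": False, "duration": True}))  # routine_gp
```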

A Selection of Giant Radio Sources from NVSS

Title A Selection of Giant Radio Sources from NVSS
Authors D. D. Proctor
Abstract Results of the application of pattern recognition techniques to the problem of identifying Giant Radio Sources (GRS) from the data in the NVSS catalog are presented and issues affecting the process are explored. Decision-tree pattern recognition software was applied to training set source pairs developed from known NVSS large angular size radio galaxies. The full training set consisted of 51,195 source pairs, 48 of which were known GRS for which each lobe was primarily represented by a single catalog component. The source pairs had a maximum separation of 20 arc minutes and a minimum component area of 1.87 square arc minutes at the 1.4 mJy level. The importance of comparing resulting probability distributions of the training and application sets for cases of unknown class ratio is demonstrated. The probability of correctly ranking a randomly selected (GRS, non-GRS) pair from the best of the tested classifiers was determined to be 97.8 +/- 1.5%. The best classifiers were applied to the over 870,000 candidate pairs from the entire catalog. Images of higher ranked sources were visually screened and a table of over sixteen hundred candidates, including morphological annotation, is presented. These systems include doubles and triples, Wide-Angle Tail (WAT) and Narrow-Angle Tail (NAT), S- or Z-shaped systems, and core-jets and resolved cores. While some resolved lobe systems are recovered with this technique, generally it is expected that such systems would require a different approach.
Tasks
Published 2016-03-22
URL http://arxiv.org/abs/1603.06895v2
PDF http://arxiv.org/pdf/1603.06895v2.pdf
PWC https://paperswithcode.com/paper/a-selection-of-giant-radio-sources-from-nvss
Repo
Framework
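
A hedged sketch of the classification-and-ranking setup: a decision tree is trained on features of candidate source pairs and candidates are ranked by the predicted GRS probability; an AUC-style score then estimates the probability of correctly ranking a random (GRS, non-GRS) pair, which is how the paper’s 97.8% figure is defined. The features and data below are synthetic.

```python
# Decision-tree ranking of synthetic candidate pairs; the class imbalance mimics GRS rarity.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_neg, n_pos = 2000, 50                      # GRS are rare among candidate pairs
X_neg = rng.normal(loc=0.0, scale=1.0, size=(n_neg, 4))
X_pos = rng.normal(loc=1.0, scale=1.0, size=(n_pos, 4))
X = np.vstack([X_neg, X_pos])
y = np.array([0] * n_neg + [1] * n_pos)

clf = DecisionTreeClassifier(max_depth=4, class_weight="balanced", random_state=0)
clf.fit(X, y)
scores = clf.predict_proba(X)[:, 1]          # rank candidates by GRS probability
print("in-sample pairwise ranking score (AUC):", round(roc_auc_score(y, scores), 3))
```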

ATGV-Net: Accurate Depth Super-Resolution

Title ATGV-Net: Accurate Depth Super-Resolution
Authors Gernot Riegler, Matthias Rüther, Horst Bischof
Abstract In this work we present a novel approach for single depth map super-resolution. Modern consumer depth sensors, especially Time-of-Flight sensors, produce dense depth measurements, but are affected by noise and have a low lateral resolution. We propose a method that combines the benefits of recent advances in machine learning based single image super-resolution, i.e. deep convolutional networks, with a variational method to recover accurate high-resolution depth maps. In particular, we integrate a variational method that models the piecewise affine structures apparent in depth data via an anisotropic total generalized variation regularization term on top of a deep network. We call our method ATGV-Net and train it end-to-end by unrolling the optimization procedure of the variational method. To train deep networks, a large corpus of training data with accurate ground-truth is required. We demonstrate that it is feasible to train our method solely on synthetic data that we generate in large quantities for this task. Our evaluations show that we achieve state-of-the-art results on three different benchmarks, as well as on a challenging Time-of-Flight dataset, all without utilizing an additional intensity image as guidance.
Tasks Depth Map Super-Resolution, Image Super-Resolution, Super-Resolution
Published 2016-07-27
URL http://arxiv.org/abs/1607.07988v1
PDF http://arxiv.org/pdf/1607.07988v1.pdf
PWC https://paperswithcode.com/paper/atgv-net-accurate-depth-super-resolution
Repo
Framework
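
A much-simplified sketch of the variational refinement stage: a coarse, noisy depth estimate is improved by a fixed number of unrolled gradient steps on a data term plus a smoothed anisotropic total-variation regularizer. The paper uses total generalized variation on top of a deep network and trains the whole pipeline end-to-end, none of which this toy NumPy version does.

```python
# Unrolled gradient descent on 0.5*||u - data||^2 + lam * TV_eps(u) for a toy depth map.
import numpy as np

def tv_grad(u, eps=1e-3):
    """Gradient of the smoothed anisotropic TV term sum(sqrt(diff(u)^2 + eps^2))."""
    g = np.zeros_like(u)
    dx = u[:, 1:] - u[:, :-1]
    wx = dx / np.sqrt(dx ** 2 + eps ** 2)
    g[:, 1:] += wx
    g[:, :-1] -= wx
    dy = u[1:, :] - u[:-1, :]
    wy = dy / np.sqrt(dy ** 2 + eps ** 2)
    g[1:, :] += wy
    g[:-1, :] -= wy
    return g

def refine_depth(coarse, data, steps=100, lam=0.1, lr=0.2):
    """Fixed number of unrolled gradient steps, mimicking the variational refinement."""
    u = coarse.copy()
    for _ in range(steps):
        u -= lr * ((u - data) + lam * tv_grad(u))
    return u

rng = np.random.default_rng(0)
coarse = np.kron(rng.uniform(1.0, 3.0, size=(8, 8)), np.ones((4, 4)))  # blocky "upsampled" depth
noisy = coarse + rng.normal(scale=0.05, size=coarse.shape)             # sensor-like noise
refined = refine_depth(noisy, noisy)
print("mean |horizontal gradient| before/after:",
      round(np.abs(np.diff(noisy)).mean(), 4), round(np.abs(np.diff(refined)).mean(), 4))
```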