May 5, 2019

2678 words 13 mins read

Paper Group ANR 490

Computational Mapping of the Ground Reflectivity with Laser Scanners. Artificial Neural Networks for Detection of Malaria in RBCs. Pattern recognition on the quantum Bloch sphere. Semi-Supervised Learning with the Deep Rendering Mixture Model. Density Estimation with Distribution Element Trees. Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances …

Computational Mapping of the Ground Reflectivity with Laser Scanners

Title Computational Mapping of the Ground Reflectivity with Laser Scanners
Authors Juan Castorena
Abstract In this investigation we focus on the problem of mapping the ground reflectivity with multiple laser scanners mounted on mobile robots/vehicles. The problem arises because regions of the ground become populated with a varying number of reflectivity measurements whose value depends on the observer and its corresponding perspective. Here, we propose a novel automatic, data-driven computational mapping framework specifically aimed at preserving edge sharpness in the map reconstruction process while accounting for the sources of measurement variation. Our new formulation generates map-perspective gradients and applies subset-selection fusion and de-noising operators to these through iterative algorithms that minimize an $\ell_1$ sparse regularized least squares formulation. Reconstruction of the ground reflectivity is then carried out via a Poisson formulation posed as an $\ell_2$ term promoting consistency with the fused gradient of map-perspectives plus a term that enforces equality constraints with reference measurement data. We demonstrate that our framework outperforms existing ones in experiments on Ford’s fleet of autonomous vehicles. For example, we show that we can achieve map enhancement (i.e., contrast enhancement), artifact removal, de-noising and map-stitching without requiring an additional reflectivity adjustment to calibrate sensors to the specific mounting and robot/vehicle motion. (A hedged solver sketch follows this entry.)
Tasks Autonomous Vehicles
Published 2016-11-28
URL http://arxiv.org/abs/1611.09203v2
PDF http://arxiv.org/pdf/1611.09203v2.pdf
PWC https://paperswithcode.com/paper/computational-mapping-of-the-ground
Repo
Framework
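
The fusion and de-noising step above minimizes an $\ell_1$ sparse regularized least squares objective. As a hedged illustration of that solver class (not the authors' implementation; the operator `A` and the data below are toy stand-ins for the map-perspective gradient setup), a minimal ISTA loop:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=200):
    """Iterative shrinkage-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy use: recover a sparse gradient field observed through a random operator.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[[3, 17, 42]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.05 * rng.standard_normal(100)
print(ista(A, b, lam=0.5).round(2))
```

FISTA or ADMM would be natural drop-in accelerations for the same objective.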

Artificial Neural Networks for Detection of Malaria in RBCs

Title Artificial Neural Networks for Detection of Malaria in RBCs
Authors Purnima Pandit, A. Anand
Abstract Malaria is one of the most common mosquito-borne diseases and a major public health problem worldwide. The current standard technique for malaria diagnosis is microscopic examination of a stained blood film. We propose the use of Artificial Neural Networks (ANNs) for diagnosing the disease in red blood cells. For this purpose, features/parameters are computed from data obtained from digital holographic images of the blood cells and are given as input to an ANN, which classifies each cell as infected or uninfected. (A toy classifier sketch follows this entry.)
Tasks
Published 2016-08-23
URL http://arxiv.org/abs/1608.06627v1
PDF http://arxiv.org/pdf/1608.06627v1.pdf
PWC https://paperswithcode.com/paper/artificial-neural-networks-for-detection-of
Repo
Framework
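
Since the entry gives no implementation details, here is a minimal, hypothetical sketch of the pipeline's final stage: a small feedforward network classifying cells from precomputed holographic features. The feature vectors and labels below are synthetic placeholders, not real cell data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical morphological features extracted from holographic images
# (e.g., cell area, mean optical thickness); labels: 1 = infected.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))               # placeholder feature vectors
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # placeholder labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```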

Pattern recognition on the quantum Bloch sphere

Title Pattern recognition on the quantum Bloch sphere
Authors Giuseppe Sergioli, Enrica Santucci, Luca Didaci, Jaroslaw A. Miszczak, Roberto Giuntini
Abstract We introduce a framework suitable for describing pattern recognition tasks using the mathematical language of density matrices. In particular, we provide a one-to-one correspondence between patterns and pure density operators. This correspondence enables us to: i) represent the Nearest Mean Classifier (NMC) in terms of quantum objects, ii) introduce a Quantum Classifier (QC). By comparing the QC with the NMC on different 2D datasets, we show that the former can provide additional information that is particularly beneficial on a classical computer, relative to the latter. (An illustrative encoding sketch follows this entry.)
Tasks
Published 2016-03-01
URL http://arxiv.org/abs/1603.00173v2
PDF http://arxiv.org/pdf/1603.00173v2.pdf
PWC https://paperswithcode.com/paper/pattern-recognition-on-the-quantum-bloch
Repo
Framework
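
The pattern-to-density-operator correspondence can be sketched by inverse stereographic projection of a 2D point onto the Bloch sphere; the toy nearest-mean classifier below follows that idea but is an illustrative reconstruction, not the authors' code.

```python
import numpy as np

PAULI = [np.array([[0, 1], [1, 0]], complex),
         np.array([[0, -1j], [1j, 0]], complex),
         np.array([[1, 0], [0, -1]], complex)]

def encode(x, y):
    """Map a 2D point to a pure density matrix rho = (I + r.sigma)/2 via
    inverse stereographic projection onto the Bloch sphere."""
    s = x**2 + y**2 + 1.0
    r = [2*x/s, 2*y/s, (x**2 + y**2 - 1)/s]
    return 0.5 * (np.eye(2, dtype=complex) + sum(ri * P for ri, P in zip(r, PAULI)))

def trace_distance(rho, sigma):
    return 0.5 * np.abs(np.linalg.eigvalsh(rho - sigma)).sum()

def fit_centroids(X, y):
    """Quantum-style nearest mean classifier: centroids are averaged density matrices."""
    return {c: np.mean([encode(*p) for p in X[y == c]], axis=0) for c in np.unique(y)}

def predict(centroids, p):
    rho = encode(*p)
    return min(centroids, key=lambda c: trace_distance(rho, centroids[c]))

X = np.array([[0.1, 0.2], [0.3, 0.1], [2.0, 2.2], [1.8, 2.5]])
y = np.array([0, 0, 1, 1])
print(predict(fit_centroids(X, y), (0.2, 0.15)))  # -> 0
```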

Semi-Supervised Learning with the Deep Rendering Mixture Model

Title Semi-Supervised Learning with the Deep Rendering Mixture Model
Authors Tan Nguyen, Wanjia Liu, Ethan Perez, Richard G. Baraniuk, Ankit B. Patel
Abstract Semi-supervised learning algorithms reduce the high cost of acquiring labeled training data by using both labeled and unlabeled data during learning. Deep Convolutional Networks (DCNs) have achieved great success in supervised tasks and as such have been widely employed in semi-supervised learning. In this paper we leverage the recently developed Deep Rendering Mixture Model (DRMM), a probabilistic generative model that models latent nuisance variation and whose inference algorithm yields DCNs. We develop an EM algorithm for the DRMM to learn from both labeled and unlabeled data. Guided by the theory of the DRMM, we introduce a novel non-negativity constraint and a variational inference term. We report state-of-the-art performance on MNIST and SVHN and competitive results on CIFAR10. We also probe deeper into how a DRMM trained in a semi-supervised setting represents latent nuisance variation using synthetically rendered images. Taken together, our work provides a unified framework for supervised, unsupervised, and semi-supervised learning. (A generic semi-supervised EM sketch follows this entry.)
Tasks
Published 2016-12-06
URL http://arxiv.org/abs/1612.01942v1
PDF http://arxiv.org/pdf/1612.01942v1.pdf
PWC https://paperswithcode.com/paper/semi-supervised-learning-with-the-deep
Repo
Framework
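
The DRMM-specific EM derivation is involved; as a hedged stand-in, the sketch below shows the generic semi-supervised EM pattern on an ordinary Gaussian mixture: labeled points keep clamped one-hot responsibilities, unlabeled points receive soft posteriors in the E-step, and both feed the M-step.

```python
import numpy as np
from scipy.stats import multivariate_normal

def semisup_em(X_lab, y_lab, X_unl, n_classes, n_iters=50):
    """EM for a Gaussian mixture using labeled and unlabeled data."""
    d = X_lab.shape[1]
    X = np.vstack([X_lab, X_unl])
    R = np.zeros((len(X), n_classes))
    R[np.arange(len(X_lab)), y_lab] = 1.0        # clamp labeled responsibilities
    R[len(X_lab):] = 1.0 / n_classes             # uniform init for unlabeled
    for _ in range(n_iters):
        # M-step: responsibility-weighted priors, means, covariances.
        pi = R.mean(axis=0)
        mu = (R.T @ X) / R.sum(axis=0)[:, None]
        cov = [np.cov(X.T, aweights=R[:, k]) + 1e-6 * np.eye(d)
               for k in range(n_classes)]
        # E-step on the unlabeled block only.
        lik = np.column_stack([pi[k] * multivariate_normal(mu[k], cov[k]).pdf(X_unl)
                               for k in range(n_classes)])
        R[len(X_lab):] = lik / lik.sum(axis=1, keepdims=True)
    return mu, cov, pi

rng = np.random.default_rng(0)
X0, X1 = rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))
mu, _, _ = semisup_em(np.vstack([X0[:3], X1[:3]]), np.array([0, 0, 0, 1, 1, 1]),
                      np.vstack([X0[3:], X1[3:]]), n_classes=2)
print(mu.round(1))
```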

Density Estimation with Distribution Element Trees

Title Density Estimation with Distribution Element Trees
Authors Daniel W. Meyer
Abstract The estimation of probability densities based on available data is a central task in many statistical applications. Especially in the case of large ensembles with many samples or high-dimensional sample spaces, computationally efficient methods are needed. We propose a new method based on a decomposition of the unknown distribution in terms of so-called distribution elements (DEs). These elements enable an adaptive and hierarchical discretization of the sample space with small or large elements in regions with smoothly or highly variable densities, respectively. The novel refinement strategy that we propose is based on statistical goodness-of-fit and pair-wise (as an approximation to mutual) independence tests that evaluate the local approximation of the distribution in terms of DEs. The capabilities of our new method are examined on several examples of varying dimensionality and compared favorably with other state-of-the-art density estimators. (A toy tree-refinement sketch follows this entry.)
Tasks Density Estimation
Published 2016-10-02
URL http://arxiv.org/abs/1610.00345v2
PDF http://arxiv.org/pdf/1610.00345v2.pdf
PWC https://paperswithcode.com/paper/density-estimation-with-distribution-element
Repo
Framework
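
A toy caricature of the refinement strategy (my reconstruction, not the paper's algorithm): recursively bisect a cell along the dimension whose marginal most strongly fails a chi-square uniformity test, and stop where the local uniform fit is accepted.

```python
import numpy as np
from scipy.stats import chisquare

def build_det(X, lo, hi, alpha=0.01, min_pts=30):
    """Toy distribution-element tree over the box [lo, hi]."""
    n, d = X.shape
    if n < min_pts:
        return {"lo": lo, "hi": hi, "n": n}
    stats, pvals = [], []
    for j in range(d):
        counts, _ = np.histogram(X[:, j], bins=4, range=(lo[j], hi[j]))
        s, p = chisquare(counts)              # goodness of fit vs. uniform
        stats.append(s); pvals.append(p)
    j = int(np.argmax(stats))
    if pvals[j] > alpha:                      # locally uniform: leaf element
        return {"lo": lo, "hi": hi, "n": n}
    mid = 0.5 * (lo[j] + hi[j])               # rejected: bisect that dimension
    mask = X[:, j] < mid
    hi_l, lo_r = hi.copy(), lo.copy()
    hi_l[j] = mid; lo_r[j] = mid
    return {"split": (j, mid),
            "left": build_det(X[mask], lo, hi_l, alpha, min_pts),
            "right": build_det(X[~mask], lo_r, hi, alpha, min_pts)}

X = np.random.default_rng(0).beta(2, 5, size=(2000, 2))
tree = build_det(X, lo=np.zeros(2), hi=np.ones(2))
print(tree["split"])
```

A leaf's density estimate is its count divided by $N$ times its volume; the paper additionally tests pairwise independence before accepting an element.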

Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances

Title Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances
Authors Frank Nielsen, Boris Muzellec, Richard Nock
Abstract We consider the supervised classification problem of machine learning in Cayley-Klein projective geometries: We show how to learn a curved Mahalanobis metric distance corresponding to either the hyperbolic geometry or the elliptic geometry using the Large Margin Nearest Neighbor (LMNN) framework. We report on our experimental results, and further consider the case of learning a mixed curved Mahalanobis distance. In addition, we show that the Cayley-Klein Voronoi diagrams are affine and can be built from equivalent (clipped) power diagrams, and that Cayley-Klein balls have Mahalanobis shapes with displaced centers. (A distance-function sketch follows this entry.)
Tasks
Published 2016-09-22
URL http://arxiv.org/abs/1609.07082v2
PDF http://arxiv.org/pdf/1609.07082v2.pdf
PWC https://paperswithcode.com/paper/large-margin-nearest-neighbor-classification
Repo
Framework
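
The hyperbolic Cayley-Klein distance has a compact closed form in the Klein ball; the sketch below pairs it with a 1-nearest-neighbor rule. The matrix `M` is fixed to the identity here, whereas in the paper it would be learned with LMNN, and points are assumed to lie inside the unit ball of the Mahalanobis norm.

```python
import numpy as np

def curved_mahalanobis(p, q, M):
    """Hyperbolic Cayley-Klein distance with metric matrix M
    (assumes p, q lie strictly inside the unit M-ball)."""
    pq, pp, qq = p @ M @ q, p @ M @ p, q @ M @ q
    return np.arccosh((1.0 - pq) / np.sqrt((1.0 - pp) * (1.0 - qq)))

def nn_classify(X_train, y_train, x, M):
    d = [curved_mahalanobis(p, x, M) for p in X_train]
    return y_train[int(np.argmin(d))]

rng = np.random.default_rng(0)
X = rng.uniform(-0.5, 0.5, size=(40, 2))
y = (X[:, 0] > 0).astype(int)
M = np.eye(2)          # in LMNN this matrix would be learned from triplets
print(nn_classify(X, y, np.array([0.3, -0.1]), M))  # -> 1
```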

Gray-box inference for structured Gaussian process models

Title Gray-box inference for structured Gaussian process models
Authors Pietro Galliani, Amir Dezfouli, Edwin V. Bonilla, Novi Quadrianto
Abstract We develop an automated variational inference method for Bayesian structured prediction problems with Gaussian process (GP) priors and linear-chain likelihoods. Our approach does not need to know the details of the structured likelihood model and can scale up to a large number of observations. Furthermore, we show that the required expected likelihood term and its gradients in the variational objective (ELBO) can be estimated efficiently by using expectations over very low-dimensional Gaussian distributions. Optimization of the ELBO is fully parallelizable over sequences and amenable to stochastic optimization, which we use along with control variate techniques and state-of-the-art incremental optimization to make our framework useful in practice. Results on a set of natural language processing tasks show that our method can be as good as (and sometimes better than) hard-coded approaches including SVM-struct and CRFs, and overcomes the scalability limitations of previous inference algorithms based on sampling. Overall, this is a fundamental step toward developing automated inference methods for Bayesian structured prediction. (A Monte Carlo estimator sketch follows this entry.)
Tasks Stochastic Optimization, Structured Prediction
Published 2016-09-14
URL http://arxiv.org/abs/1609.04289v1
PDF http://arxiv.org/pdf/1609.04289v1.pdf
PWC https://paperswithcode.com/paper/gray-box-inference-for-structured-gaussian
Repo
Framework
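
The "very low-dimensional Gaussian" trick is easiest to see for a fully factorized likelihood, where the expected log-likelihood in the ELBO needs only the 1-D marginals $q(f_n) = \mathcal{N}(m_n, s_n^2)$; the paper's linear-chain case needs low-dimensional joint marginals over neighboring variables instead. A hypothetical Monte Carlo estimator:

```python
import numpy as np

def expected_log_lik(y, m, s, log_lik, n_samples=100, rng=None):
    """Monte Carlo estimate of E_{q(f)}[log p(y | f)] using only the
    1-D marginals q(f_n) = N(m_n, s_n^2) of the variational posterior."""
    rng = rng or np.random.default_rng(0)
    f = m + s * rng.standard_normal((n_samples, len(m)))   # marginal samples
    return log_lik(y, f).sum(axis=1).mean()

def bern_logit(y, f):
    """Toy Bernoulli-logit likelihood (an assumption for illustration)."""
    return y * f - np.log1p(np.exp(f))

y = np.array([1, 0, 1])
m, s = np.array([0.5, -1.0, 2.0]), np.array([0.3, 0.3, 0.3])
print(expected_log_lik(y, m, s, bern_logit))
```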

Convex Decomposition And Efficient Shape Representation Using Deformable Convex Polytopes

Title Convex Decomposition And Efficient Shape Representation Using Deformable Convex Polytopes
Authors Fitsum Mesadi, Tolga Tasdizen
Abstract Decomposition of shapes into (approximately) convex parts is essential for applications such as part-based shape representation, shape matching, and collision detection. In this paper, we propose a novel convex decomposition using a parametric implicit shape model called the Disjunctive Normal Shape Model (DNSM). The DNSM is formed as a union of polytopes, which themselves are formed by intersections of halfspaces. The key idea is that, by deforming the polytopes, which naturally remain convex during the evolution, convex parts are captured without the need to compute convexity explicitly. The major contributions of this paper include a robust convex decomposition that also yields an efficient part-based shape representation, and a novel shape convexity measure. The experimental results show the potential of the proposed method. (An evaluation sketch of the DNSM form follows this entry.)
Tasks
Published 2016-06-23
URL http://arxiv.org/abs/1606.07509v1
PDF http://arxiv.org/pdf/1606.07509v1.pdf
PWC https://paperswithcode.com/paper/convex-decomposition-and-efficient-shape
Repo
Framework
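
The DNSM's union-of-intersections structure can be written down directly: each polytope is a product (conjunction) of sigmoided halfspaces, and the union is a De Morgan disjunction over polytopes. A toy evaluation sketch, with shapes and parameters invented for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dnsm(points, W, b, sharpness=10.0):
    """Disjunctive Normal Shape Model: a union (disjunction) of polytopes,
    each an intersection (conjunction) of halfspaces w.x + b >= 0.
    W: (n_polytopes, n_halfspaces, dim); returns values in (0, 1)."""
    h = sigmoid(sharpness * (np.einsum('khd,nd->nkh', W, points) + b))
    conj = h.prod(axis=2)                   # soft intersection per polytope
    return 1.0 - (1.0 - conj).prod(axis=1)  # soft union via De Morgan

# Two axis-aligned squares as a toy shape.
W = np.array([[[1, 0], [-1, 0], [0, 1], [0, -1]],
              [[1, 0], [-1, 0], [0, 1], [0, -1]]], float)
b = np.array([[0.5, 0.5, 0.5, 0.5],            # square centered at the origin
              [-1.0, 2.0, 0.5, 0.5]], float)   # square covering x in [1, 2]
pts = np.array([[0.0, 0.0], [1.5, 0.0], [3.0, 0.0]])
print(dnsm(pts, W, b).round(2))                # ~[1, 1, 0]: inside, inside, outside
```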

Calorie Counter: RGB-Depth Visual Estimation of Energy Expenditure at Home

Title Calorie Counter: RGB-Depth Visual Estimation of Energy Expenditure at Home
Authors Lili Tao, Tilo Burghardt, Majid Mirmehdi, Dima Damen, Ashley Cooper, Sion Hannuna, Massimo Camplani, Adeline Paiement, Ian Craddock
Abstract We present a new framework for vision-based estimation of calorific expenditure from RGB-D data - the first that is validated on physical gas exchange measurements and applied to daily living scenarios. Deriving a person’s energy expenditure from sensors is an important tool in tracking physical activity levels for health and lifestyle monitoring. Most existing methods use metabolic lookup tables (METs) for a manual estimate or systems with inertial sensors that ultimately require users to wear devices. In contrast, the proposed pose-invariant and individual-independent vision framework allows for remote estimation of calorific expenditure. We introduce, and evaluate our approach on, a new dataset called SPHERE-calorie, for which visual estimates can be compared against simultaneously obtained indirect calorimetry measures based on per-breath gas exchange. We conclude from our experiments that the proposed vision pipeline is suitable for home monitoring in a controlled environment, with calorific expenditure estimates more accurate than commonly used manual estimates via METs. With the dataset released, our work establishes a baseline for future research in this little-explored area of computer vision.
Tasks
Published 2016-07-27
URL http://arxiv.org/abs/1607.08196v1
PDF http://arxiv.org/pdf/1607.08196v1.pdf
PWC https://paperswithcode.com/paper/calorie-counter-rgb-depth-visual-estimation
Repo
Framework

Decision Aids for Adversarial Planning in Military Operations: Algorithms, Tools, and Turing-test-like Experimental Validation

Title Decision Aids for Adversarial Planning in Military Operations: Algorithms, Tools, and Turing-test-like Experimental Validation
Authors Alexander Kott, Ray Budd, Larry Ground, Lakshmi Rebbapragada, John Langston
Abstract Use of intelligent decision aids can help alleviate the challenges of planning complex operations. We describe integrated algorithms, and a tool capable of translating a high-level concept for a tactical military operation into a fully detailed, actionable plan, automatically (or with human guidance) producing plans with a realistic degree of detail and human-like quality. Tight interleaving of several algorithms – planning, adversary estimates, scheduling, routing, attrition and consumption estimates – comprises the computational approach of this tool. Although originally developed for Army large-unit operations, the technology is generic and also applies to a number of other domains, particularly in critical situations requiring detailed planning within a constrained period of time. In this paper, we focus particularly on the engineering tradeoffs in the design of the tool. In an experimental evaluation, reminiscent of the Turing test, the tool’s performance compared favorably with that of human planners.
Tasks
Published 2016-01-22
URL http://arxiv.org/abs/1601.06108v1
PDF http://arxiv.org/pdf/1601.06108v1.pdf
PWC https://paperswithcode.com/paper/decision-aids-for-adversarial-planning-in
Repo
Framework

Black-box Importance Sampling

Title Black-box Importance Sampling
Authors Qiang Liu, Jason D. Lee
Abstract Importance sampling is widely used in machine learning and statistics, but its power is limited by the restriction of using simple proposals for which the importance weights can be tractably calculated. We address this problem by studying black-box importance sampling methods that calculate importance weights for samples generated from any unknown proposal or black-box mechanism. Our method allows us to use better and richer proposals to solve difficult problems, and (somewhat counter-intuitively) also has the additional benefit of improving the estimation accuracy beyond typical importance sampling. Both theoretical and empirical analyses are provided. (A hedged weighting sketch follows this entry.)
Tasks
Published 2016-10-17
URL http://arxiv.org/abs/1610.05247v1
PDF http://arxiv.org/pdf/1610.05247v1.pdf
PWC https://paperswithcode.com/paper/black-box-importance-sampling
Repo
Framework
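
My understanding is that the weights in this paper come from minimizing a kernelized Stein discrepancy, which needs only the target's score function, so the proposal can stay a black box. The sketch below is an assumption-laden miniature with an RBF base kernel and a simplex-constrained quadratic program:

```python
import numpy as np
from scipy.optimize import minimize

def stein_kernel(X, score, h=1.0):
    """Stein kernel k_p(x, y) built on an RBF base kernel; needs only
    score(x) = grad log p(x), so p may be unnormalized."""
    n, d = X.shape
    S = score(X)                                   # (n, d) scores
    D = X[:, None, :] - X[None, :, :]              # pairwise differences
    sq = (D ** 2).sum(-1)
    K = np.exp(-sq / (2 * h * h))
    t1 = (S @ S.T) * K
    t2 = (S[:, None, :] * D).sum(-1) / h**2 * K    # s(x) . grad_y k
    t3 = -(S[None, :, :] * D).sum(-1) / h**2 * K   # s(y) . grad_x k
    t4 = (d / h**2 - sq / h**4) * K                # trace term
    return t1 + t2 + t3 + t4

def bbis_weights(Kp):
    """Black-box importance weights: minimize w' Kp w over the simplex."""
    n = len(Kp)
    res = minimize(lambda w: w @ Kp @ w, np.full(n, 1.0 / n),
                   jac=lambda w: 2 * Kp @ w, method="SLSQP",
                   bounds=[(0, 1)] * n,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
    return res.x

# Samples from a 'wrong' proposal, reweighted toward the target p = N(0, 1).
X = np.random.default_rng(0).normal(1.0, 1.0, size=(80, 1))
w = bbis_weights(stein_kernel(X, score=lambda X: -X))
print("weighted mean:", float(w @ X[:, 0]))        # pulled toward 0
```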

Deep Learning Approximation for Stochastic Control Problems

Title Deep Learning Approximation for Stochastic Control Problems
Authors Jiequn Han, Weinan E
Abstract Many real world stochastic control problems suffer from the “curse of dimensionality”. To overcome this difficulty, we develop a deep learning approach that directly solves high-dimensional stochastic control problems based on Monte-Carlo sampling. We approximate the time-dependent controls as feedforward neural networks and stack these networks together through the model dynamics. The objective function of the control problem plays the role of the loss function for the deep neural network. We test this approach on examples from the areas of optimal trading and energy storage. Our results suggest that the algorithm presented here achieves satisfactory accuracy and, at the same time, can handle rather high-dimensional problems. (A minimal training-loop sketch follows this entry.)
Tasks
Published 2016-11-02
URL http://arxiv.org/abs/1611.07422v1
PDF http://arxiv.org/pdf/1611.07422v1.pdf
PWC https://paperswithcode.com/paper/deep-learning-approximation-for-stochastic
Repo
Framework
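
The architecture described above is concrete enough for a small sketch: one feedforward subnetwork per time step produces the control, the subnetworks are chained through toy (invented) dynamics, and the control objective is used directly as the training loss. A minimal PyTorch version:

```python
import torch
import torch.nn as nn

T, d, batch = 5, 3, 256
nets = nn.ModuleList([nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, d))
                      for _ in range(T)])          # one control net per time step
opt = torch.optim.Adam(nets.parameters(), lr=1e-3)

for step in range(500):
    x = torch.randn(batch, d)                      # Monte-Carlo initial states
    cost = torch.zeros(batch)
    for t in range(T):
        u = nets[t](x)                             # time-dependent control
        cost = cost + (u ** 2).sum(dim=1)          # running control cost
        x = x + 0.1 * u + 0.05 * torch.randn_like(x)  # toy stochastic dynamics
    loss = (cost + (x ** 2).sum(dim=1)).mean()     # plus a terminal cost
    opt.zero_grad(); loss.backward(); opt.step()
```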

A Hierarchical Distributed Processing Framework for Big Image Data

Title A Hierarchical Distributed Processing Framework for Big Image Data
Authors Le Dong, Zhiyu Lin, Yan Liang, Ling He, Ning Zhang, Qi Chen, Xiaochun Cao, Ebroul Izquierdo
Abstract This paper introduces an effective processing framework named ICP (Image Cloud Processing) to cope with the data explosion in the image processing field. While most previous research focuses on optimizing image processing algorithms to gain higher efficiency, our work is dedicated to providing a general framework for those image processing algorithms that can be implemented in parallel, so as to boost time efficiency without compromising result quality as the image scale increases. The proposed ICP framework consists of two mechanisms, i.e. SICP (Static ICP) and DICP (Dynamic ICP). Specifically, SICP is aimed at processing big image data pre-stored in the distributed system, while DICP is proposed for dynamic input. To accomplish SICP, two novel data representations named P-Image and Big-Image are designed to cooperate with MapReduce to achieve a more optimized configuration and higher efficiency. DICP is implemented through a parallel processing procedure working with the traditional processing mechanism of the distributed system. Representative results of comprehensive experiments on the challenging ImageNet dataset validate the advantage of our proposed ICP framework over traditional state-of-the-art methods, in both time efficiency and quality of results. (A loose parallel-map analogy follows this entry.)
Tasks
Published 2016-07-03
URL http://arxiv.org/abs/1607.00577v1
PDF http://arxiv.org/pdf/1607.00577v1.pdf
PWC https://paperswithcode.com/paper/a-hierarchical-distributed-processing
Repo
Framework
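
As a loose analogy to SICP's map/reduce stage (the function names and file paths below are hypothetical, and the framework's P-Image/Big-Image representations are not modeled), a parallel per-image map followed by a reduce:

```python
from multiprocessing import Pool

def process_image(path):
    # Placeholder for real per-image work (load, transform, extract features).
    return len(path)

def reduce_results(results):
    return sum(results)

if __name__ == "__main__":
    paths = [f"shard_{i}.jpg" for i in range(100)]   # hypothetical image store
    with Pool(processes=4) as pool:
        results = pool.map(process_image, paths)     # parallel map stage
    print(reduce_results(results))                   # reduce stage
```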

Sketching Meets Random Projection in the Dual: A Provable Recovery Algorithm for Big and High-dimensional Data

Title Sketching Meets Random Projection in the Dual: A Provable Recovery Algorithm for Big and High-dimensional Data
Authors Jialei Wang, Jason D. Lee, Mehrdad Mahdavi, Mladen Kolar, Nathan Srebro
Abstract Sketching techniques have become popular for scaling up machine learning algorithms by reducing the sample size or dimensionality of massive data sets, while still maintaining the statistical power of big data. In this paper, we study sketching from an optimization point of view: we first show that the iterative Hessian sketch is an optimization process with preconditioning, and develop an accelerated iterative Hessian sketch by searching along conjugate directions; we then establish primal-dual connections between the Hessian sketch and dual random projection, and apply the preconditioned conjugate gradient approach to the dual problem, which leads to accelerated iterative dual random projection methods. Finally, to tackle the challenges of both large sample size and high dimensionality, we propose the primal-dual sketch, which iteratively sketches the primal and dual formulations. We show that using a logarithmic number of calls to solvers of small-scale problems, the primal-dual sketch is able to recover the optimum of the original problem up to arbitrary precision. The proposed algorithms are validated via extensive experiments on synthetic and real data sets, which complement our theoretical results. (A basic iterative Hessian sketch follows this entry.)
Tasks
Published 2016-10-10
URL http://arxiv.org/abs/1610.03045v1
PDF http://arxiv.org/pdf/1610.03045v1.pdf
PWC https://paperswithcode.com/paper/sketching-meets-random-projection-in-the-dual
Repo
Framework
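
The iterative Hessian sketch that the paper accelerates fits in a few lines: each iteration draws a fresh sketch, solves a small sketched normal system, and applies the result as a preconditioned correction. A minimal version under Gaussian sketching (not the paper's accelerated or primal-dual variants):

```python
import numpy as np

def iterative_hessian_sketch(A, b, m, n_iters=20, rng=None):
    """Iterative Hessian sketch for least squares min ||Ax - b||^2."""
    rng = rng or np.random.default_rng(0)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iters):
        S = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian sketch
        SA = S @ A                                    # small m x d matrix
        g = A.T @ (b - A @ x)                         # full gradient
        x = x + np.linalg.solve(SA.T @ SA, g)         # sketched Hessian solve
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 20))
b = A @ rng.standard_normal(20) + 0.01 * rng.standard_normal(2000)
x = iterative_hessian_sketch(A, b, m=200)
print(np.linalg.norm(A.T @ (A @ x - b)))              # near-zero optimality gap
```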

Biclustering Readings and Manuscripts via Non-negative Matrix Factorization, with Application to the Text of Jude

Title Biclustering Readings and Manuscripts via Non-negative Matrix Factorization, with Application to the Text of Jude
Authors Joey McCollum, Stephen Brown
Abstract The text-critical practice of grouping witnesses into families or text-types often faces two obstacles: contamination in the manuscript tradition, and co-dependence in identifying characteristic readings and manuscripts. We introduce non-negative matrix factorization (NMF) as a simple, unsupervised, and efficient way to cluster large numbers of manuscripts and readings simultaneously, while summarizing contamination using an easy-to-interpret mixture model. We apply this method to an extensive collation of the New Testament epistle of Jude and show that the resulting clusters correspond to human-identified textual families from existing research. (A toy factorization sketch follows this entry.)
Tasks
Published 2016-02-03
URL http://arxiv.org/abs/1602.01323v1
PDF http://arxiv.org/pdf/1602.01323v1.pdf
PWC https://paperswithcode.com/paper/biclustering-readings-and-manuscripts-via-non
Repo
Framework
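
A hedged miniature of the setup (the attestation matrix below is random, not the Jude collation): rows are readings, columns are manuscripts, and NMF yields reading clusters plus a per-manuscript mixture over clusters, so contamination shows up as a manuscript loading on several clusters.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
A = (rng.random((30, 12)) < 0.3).astype(float)   # toy readings x manuscripts matrix

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(A)        # readings x clusters
H = model.components_             # clusters x manuscripts

mix = H / H.sum(axis=0, keepdims=True)   # columns read as mixture proportions
print(mix.round(2))
```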