May 6, 2019

2971 words 14 mins read

Paper Group ANR 161

‘Part’ly first among equals: Semantic part-based benchmarking for state-of-the-art object recognition systems. Efficient adaptation of complex-valued noiselet sensing matrices for compressed single-pixel imaging. Optimising The Input Window Alignment in CD-DNN Based Phoneme Recognition for Low Latency Processing. Shallow Parsing Pipeline for Hindi- …

‘Part’ly first among equals: Semantic part-based benchmarking for state-of-the-art object recognition systems

Title ‘Part’ly first among equals: Semantic part-based benchmarking for state-of-the-art object recognition systems
Authors Ravi Kiran Sarvadevabhatla, Shanthakumar Venkatraman, R. Venkatesh Babu
Abstract An examination of object recognition challenge leaderboards (ILSVRC, PASCAL-VOC) reveals that the top-performing classifiers typically exhibit small differences amongst themselves in terms of error rate/mAP. To better differentiate the top performers, additional criteria are required. Moreover, the (test) images, on which the performance scores are based, predominantly contain fully visible objects. Therefore, ‘harder’ test images, mimicking the challenging conditions (e.g. occlusion) in which humans routinely recognize objects, need to be utilized for benchmarking. To address the concerns mentioned above, we make two contributions. First, we systematically vary the level of local object-part content, global detail and spatial context in images from PASCAL VOC 2010 to create a new benchmarking dataset dubbed PPSS-12. Second, we propose an object-part based benchmarking procedure which quantifies classifiers’ robustness to a range of visibility and contextual settings. The benchmarking procedure relies on a semantic similarity measure that naturally addresses potential semantic granularity differences between the category labels in training and test datasets, thus eliminating manual mapping. We use our procedure on the PPSS-12 dataset to benchmark top-performing classifiers trained on the ILSVRC-2012 dataset. Our results show that the proposed benchmarking procedure enables additional differentiation among state-of-the-art object classifiers in terms of their ability to handle missing content and insufficient object detail. Given this capability for additional differentiation, our approach can potentially supplement existing benchmarking procedures used in object recognition challenge leaderboards.
Tasks Object Recognition, Semantic Similarity, Semantic Textual Similarity
Published 2016-11-23
URL http://arxiv.org/abs/1611.07703v2
PDF http://arxiv.org/pdf/1611.07703v2.pdf
PWC https://paperswithcode.com/paper/partly-first-among-equals-semantic-part-based
Repo
Framework
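
The benchmarking procedure above hinges on a semantic similarity measure between training-time and test-time category labels. The paper's exact measure is not spelled out in this summary, so the sketch below uses WordNet path similarity (via NLTK) purely as an illustrative stand-in for crediting a fine-grained prediction against a coarser test label.

```python
from nltk.corpus import wordnet as wn  # assumes the WordNet corpus has been downloaded

def label_similarity(label_a, label_b):
    """Best WordNet path similarity between the noun senses of two category labels."""
    synsets_a = wn.synsets(label_a, pos=wn.NOUN)
    synsets_b = wn.synsets(label_b, pos=wn.NOUN)
    scores = [a.path_similarity(b) for a in synsets_a for b in synsets_b]
    scores = [s for s in scores if s is not None]
    return max(scores) if scores else 0.0

# A coarse test label can still be credited for a fine-grained prediction.
print(label_similarity("dog", "poodle"))    # high similarity
print(label_similarity("dog", "airplane"))  # low similarity
```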

Efficient adaptation of complex-valued noiselet sensing matrices for compressed single-pixel imaging

Title Efficient adaptation of complex-valued noiselet sensing matrices for compressed single-pixel imaging
Authors Anna Pastuszczak, Bartłomiej Szczygieł, Michał Mikołajczyk, Rafał Kotyński
Abstract Minimal mutual coherence of discrete noiselets and Haar wavelets makes this pair of bases an essential choice for the measurement and compression matrices in compressed-sensing-based single-pixel detectors. In this paper we propose an efficient way of using complex-valued and non-binary noiselet functions for object sampling in single-pixel cameras with binary spatial light modulators and incoherent illumination. The proposed method makes it possible to determine m complex noiselet coefficients from m+1 binary sampling measurements. Further, we introduce a modification to the complex fast noiselet transform, which enables computationally-efficient real-time generation of the binary noiselet-based patterns using efficient integer calculations on bundled patterns. The proposed method is verified experimentally with a single-pixel camera system using a binary spatial light modulator.
Tasks
Published 2016-06-14
URL http://arxiv.org/abs/1606.04535v1
PDF http://arxiv.org/pdf/1606.04535v1.pdf
PWC https://paperswithcode.com/paper/efficient-adaptation-of-complex-valued
Repo
Framework

Optimising The Input Window Alignment in CD-DNN Based Phoneme Recognition for Low Latency Processing

Title Optimising The Input Window Alignment in CD-DNN Based Phoneme Recognition for Low Latency Processing
Authors Akash Kumar Dhaka, Giampiero Salvi
Abstract We present a systematic analysis on the performance of a phonetic recogniser when the window of input features is not symmetric with respect to the current frame. The recogniser is based on Context Dependent Deep Neural Networks (CD-DNNs) and Hidden Markov Models (HMMs). The objective is to reduce the latency of the system by reducing the number of future feature frames required to estimate the current output. Our tests performed on the TIMIT database show that the performance does not degrade when the input window is shifted up to 5 frames in the past compared to common practice (no future frame). This corresponds to improving the latency by 50 ms in our settings. Our tests also show that the best results are not obtained with the symmetric window commonly employed, but with an asymmetric window with eight past and two future context frames, although this observation should be confirmed on other data sets. The reduction in latency suggested by our results is critical for specific applications such as real-time lip synchronisation for tele-presence, but may also be beneficial in general applications to improve the lag in human-machine spoken interaction.
Tasks
Published 2016-06-29
URL http://arxiv.org/abs/1606.09163v1
PDF http://arxiv.org/pdf/1606.09163v1.pdf
PWC https://paperswithcode.com/paper/optimising-the-input-window-alignment-in-cd
Repo
Framework
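
The experimental variable here is the alignment of the feature window around the current frame. Below is a minimal sketch of building such an asymmetric context window (with the eight-past/two-future setting the abstract reports as best); the frame stacking itself is standard practice, and the feature dimensions in the example are placeholders.

```python
import numpy as np

def stack_context(features, past=8, future=2):
    """Stack an asymmetric context window around each frame.

    features: (T, D) array of per-frame acoustic features. Returns a
    (T, (past + 1 + future) * D) array in which each row concatenates the
    `past` previous frames, the current frame and the `future` upcoming
    frames; edges are padded by repeating the boundary frames.
    """
    T, _ = features.shape
    padded = np.concatenate([
        np.repeat(features[:1], past, axis=0),    # pad the start
        features,
        np.repeat(features[-1:], future, axis=0)  # pad the end
    ], axis=0)
    return np.stack([padded[t:t + past + 1 + future].reshape(-1) for t in range(T)])

# Example: 100 frames of 40-dimensional filterbank features -> (100, 440) inputs.
feats = np.random.randn(100, 40)
stacked = stack_context(feats, past=8, future=2)
```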

Shallow Parsing Pipeline for Hindi-English Code-Mixed Social Media Text

Title Shallow Parsing Pipeline for Hindi-English Code-Mixed Social Media Text
Authors Arnav Sharma, Sakshi Gupta, Raveesh Motlani, Piyush Bansal, Manish Shrivastava, Radhika Mamidi, Dipti M. Sharma
Abstract In this study, the problem of shallow parsing of Hindi-English code-mixed social media text (CSMT) has been addressed. We have annotated the data, developed a language identifier, a normalizer, a part-of-speech tagger and a shallow parser. To the best of our knowledge, we are the first to attempt shallow parsing on CSMT. The pipeline developed has been made available to the research community with the goal of enabling better text analysis of Hindi English CSMT. The pipeline is accessible at http://bit.ly/csmt-parser-api .
Tasks
Published 2016-04-11
URL http://arxiv.org/abs/1604.03136v1
PDF http://arxiv.org/pdf/1604.03136v1.pdf
PWC https://paperswithcode.com/paper/shallow-parsing-pipeline-for-hindi-english
Repo
Framework

Online Optimization Methods for the Quantification Problem

Title Online Optimization Methods for the Quantification Problem
Authors Purushottam Kar, Shuai Li, Harikrishna Narasimhan, Sanjay Chawla, Fabrizio Sebastiani
Abstract The estimation of class prevalence, i.e., the fraction of a population that belongs to a certain class, is a very useful tool in data analytics and learning, and finds applications in many domains such as sentiment analysis, epidemiology, etc. For example, in sentiment analysis, the objective is often not to estimate whether a specific text conveys a positive or a negative sentiment, but rather estimate the overall distribution of positive and negative sentiments during an event window. A popular way of performing the above task, often dubbed quantification, is to use supervised learning to train a prevalence estimator from labeled data. Contemporary literature cites several performance measures used to measure the success of such prevalence estimators. In this paper we propose the first online stochastic algorithms for directly optimizing these quantification-specific performance measures. We also provide algorithms that optimize hybrid performance measures that seek to balance quantification and classification performance. Our algorithms present a significant advancement in the theory of multivariate optimization and we show, by a rigorous theoretical analysis, that they exhibit optimal convergence. We also report extensive experiments on benchmark and real data sets which demonstrate that our methods significantly outperform existing optimization techniques used for these performance measures.
Tasks Epidemiology, Sentiment Analysis
Published 2016-05-13
URL http://arxiv.org/abs/1605.04135v3
PDF http://arxiv.org/pdf/1605.04135v3.pdf
PWC https://paperswithcode.com/paper/online-optimization-methods-for-the
Repo
Framework
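
As the abstract notes, a popular way of performing quantification is to train a prevalence estimator with supervised learning. For orientation, the sketch below shows a simple adjusted classify-and-count estimator; it is not the online stochastic optimization proposed in the paper, and the synthetic data and split names are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def adjusted_classify_and_count(clf, X_pool, X_val, y_val):
    """Correct the raw positive-prediction rate on the unlabelled pool using the
    classifier's true/false positive rates estimated on a validation split."""
    raw = clf.predict(X_pool).mean()
    val_pred = clf.predict(X_val)
    tpr = val_pred[y_val == 1].mean()
    fpr = val_pred[y_val == 0].mean()
    return float(np.clip((raw - fpr) / (tpr - fpr), 0.0, 1.0))

# Synthetic example with a true positive-class prevalence of roughly 0.3.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 1.0, (700, 2)), rng.normal(1.5, 1.0, (300, 2))])
y = np.concatenate([np.zeros(700), np.ones(300)])
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_val, X_pool, y_val, _ = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)
print(adjusted_classify_and_count(clf, X_pool, X_val, y_val))
```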

Disease Trajectory Maps

Title Disease Trajectory Maps
Authors Peter Schulam, Raman Arora
Abstract Medical researchers are coming to appreciate that many diseases are in fact complex, heterogeneous syndromes composed of subpopulations that express different variants of a related complication. Time series data extracted from individual electronic health records (EHR) offer an exciting new way to study subtle differences in the way these diseases progress over time. In this paper, we focus on answering two questions that can be asked using these databases of time series. First, we want to understand whether there are individuals with similar disease trajectories and whether there are a small number of degrees of freedom that account for differences in trajectories across the population. Second, we want to understand how important clinical outcomes are associated with disease trajectories. To answer these questions, we propose the Disease Trajectory Map (DTM), a novel probabilistic model that learns low-dimensional representations of sparse and irregularly sampled time series. We propose a stochastic variational inference algorithm for learning the DTM that allows the model to scale to large modern medical datasets. To demonstrate the DTM, we analyze data collected on patients with the complex autoimmune disease, scleroderma. We find that DTM learns meaningful representations of disease trajectories and that the representations are significantly associated with important clinical outcomes.
Tasks Time Series
Published 2016-06-29
URL http://arxiv.org/abs/1606.09184v1
PDF http://arxiv.org/pdf/1606.09184v1.pdf
PWC https://paperswithcode.com/paper/disease-trajectory-maps
Repo
Framework

From Collective Adaptive Systems to Human Centric Computation and Back: Spatial Model Checking for Medical Imaging

Title From Collective Adaptive Systems to Human Centric Computation and Back: Spatial Model Checking for Medical Imaging
Authors Gina Belmonte, Vincenzo Ciancia, Diego Latella, Mieke Massink
Abstract Recent research on formal verification for Collective Adaptive Systems (CAS) pushed advancements in spatial and spatio-temporal model checking, and as a side result provided novel image analysis methodologies, rooted in logical methods for topological spaces. Medical Imaging (MI) is a field where such technologies show potential for ground-breaking innovation. In this position paper, we present a preliminary investigation centred on applications of spatial model checking to MI. The focus is shifted from pure logics to a mixture of logical, statistical and algorithmic approaches, driven by the logical nature intrinsic to the specification of the properties of interest in the field. As a result, novel operators are introduced, that could as well be brought back to the setting of CAS.
Tasks
Published 2016-07-08
URL http://arxiv.org/abs/1607.02235v1
PDF http://arxiv.org/pdf/1607.02235v1.pdf
PWC https://paperswithcode.com/paper/from-collective-adaptive-systems-to-human
Repo
Framework

Gearbox Fault Detection through PSO Exact Wavelet Analysis and SVM Classifier

Title Gearbox Fault Detection through PSO Exact Wavelet Analysis and SVM Classifier
Authors Amir Hosein Zamanian, Abdolreza Ohadi
Abstract Time-frequency methods have been considered the most efficient approach for vibration-based gearbox fault detection. Among these methods, the continuous wavelet transform (CWT), one of the best time-frequency methods, has been used for both stationary and transitory signals. Deficiencies of the CWT are the overlapping and distortion of signals; under these conditions a large amount of redundant information exists, which may cause false alarms or misinterpretation by the operator. In this paper a modified method called Exact Wavelet Analysis is used to minimize the effects of overlapping and distortion in the case of gearbox faults, and a Particle Swarm Optimization (PSO) algorithm is employed to implement it. The method has been applied to acceleration signals from a 2D acceleration sensor, acquired with an Advantech PCI-1710 card from a gearbox test setup at Amirkabir University of Technology. The gearbox has been considered in both healthy and chipped-tooth conditions. A kernelized Support Vector Machine (SVM) with radial basis functions uses the features extracted by exact wavelet analysis for classification, and the efficiency of this classifier is then evaluated on the other signals acquired from the setup. The results show that, in comparison with the CWT, the PSO exact wavelet transform has better feature-extraction ability at the price of more computational effort. In addition, PSO exact wavelet is faster than Genetic Algorithm (GA) exact wavelet for an equal population size, owing to how mutation and crossover are factored into the PSO algorithm. The SVM classifier with the extracted features performs very well on the gearbox data.
Tasks Fault Detection
Published 2016-05-12
URL http://arxiv.org/abs/1605.04874v1
PDF http://arxiv.org/pdf/1605.04874v1.pdf
PWC https://paperswithcode.com/paper/gearbox-fault-detection-through-pso-exact
Repo
Framework
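
The pipeline described is PSO-driven exact wavelet analysis feeding an RBF-kernel SVM. The PSO optimisation is not reproduced here; the sketch below only illustrates the plain CWT-energy-features-plus-SVM half of such a pipeline, with random arrays standing in for the measured vibration signals.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def cwt_features(signal, scales=np.arange(1, 33), wavelet="morl"):
    """Mean energy of the continuous wavelet transform at each scale."""
    coeffs, _ = pywt.cwt(signal, scales, wavelet)
    return (coeffs ** 2).mean(axis=1)  # one energy value per scale

# Placeholder vibration snippets; labels 0 = healthy, 1 = chipped tooth.
rng = np.random.default_rng(0)
signals = rng.standard_normal((40, 1024))
labels = rng.integers(0, 2, size=40)

X = np.array([cwt_features(s) for s in signals])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```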

Improving Policy Gradient by Exploring Under-appreciated Rewards

Title Improving Policy Gradient by Exploring Under-appreciated Rewards
Authors Ofir Nachum, Mohammad Norouzi, Dale Schuurmans
Abstract This paper presents a novel form of policy gradient for model-free reinforcement learning (RL) with improved exploration properties. Current policy-based methods use entropy regularization to encourage undirected exploration of the reward landscape, which is ineffective in high dimensional spaces with sparse rewards. We propose a more directed exploration strategy that promotes exploration of under-appreciated reward regions. An action sequence is considered under-appreciated if its log-probability under the current policy under-estimates its resulting reward. The proposed exploration strategy is easy to implement, requiring small modifications to an implementation of the REINFORCE algorithm. We evaluate the approach on a set of algorithmic tasks that have long challenged RL methods. Our approach reduces hyper-parameter sensitivity and demonstrates significant improvements over baseline methods. Our algorithm successfully solves a benchmark multi-digit addition task and generalizes to long sequences. This is, to our knowledge, the first time that a pure RL method has solved addition using only reward feedback.
Tasks
Published 2016-11-28
URL http://arxiv.org/abs/1611.09321v3
PDF http://arxiv.org/pdf/1611.09321v3.pdf
PWC https://paperswithcode.com/paper/improving-policy-gradient-by-exploring-under
Repo
Framework
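
The central idea is to up-weight sampled action sequences whose log-probability under the current policy under-estimates their reward. A minimal sketch of such self-normalised under-appreciation weights is given below; it omits the reward baseline and the mixing with the standard REINFORCE term, and the temperature is an arbitrary choice.

```python
import numpy as np

def underappreciation_weights(log_probs, rewards, tau=0.1):
    """Self-normalised weights over sampled action sequences.

    A sequence is up-weighted when reward / tau exceeds its log-probability
    under the current policy, i.e. when the policy under-appreciates it.
    """
    scores = rewards / tau - log_probs  # under-appreciation score
    scores -= scores.max()              # numerical stability before exponentiating
    w = np.exp(scores)
    return w / w.sum()

# Toy example with three sampled sequences: the rarely sampled but
# well-rewarded second sequence receives most of the weight.
log_probs = np.array([-1.0, -5.0, -2.0])
rewards = np.array([0.1, 0.4, 0.2])
print(underappreciation_weights(log_probs, rewards))
```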

Greedy Ants Colony Optimization Strategy for Solving the Curriculum Based University Course Timetabling Problem

Title Greedy Ants Colony Optimization Strategy for Solving the Curriculum Based University Course Timetabling Problem
Authors Patrick Kenekayoro, Godswill Zipamone
Abstract Timetabling is a problem faced in all higher education institutions. The International Timetabling Competition (ITC) has published a dataset that can be used to test the quality of methods used to solve this problem. A number of meta-heuristic approaches have obtained good results when tested on the ITC dataset, however few have used the ant colony optimization technique, particularly on the ITC 2007 curriculum based university course timetabling problem. This study describes an ant system that solves the curriculum based university course timetabling problem and the quality of the algorithm is tested on the ITC 2007 dataset. The ant system was able to find feasible solutions in all instances of the dataset and close to optimal solutions in some instances. The ant system performs better than some published approaches, however results obtained are not as good as those obtained by the best published approaches. This study may be used as a benchmark for ant based algorithms that solve the curriculum based university course timetabling problem.
Tasks
Published 2016-02-16
URL http://arxiv.org/abs/1602.04933v1
PDF http://arxiv.org/pdf/1602.04933v1.pdf
PWC https://paperswithcode.com/paper/greedy-ants-colony-optimization-strategy-for
Repo
Framework
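
For readers unfamiliar with ant systems, the sketch below shows the two generic ingredients such an approach needs: a probabilistic slot-selection rule and a pheromone evaporation/reinforcement update for an event-to-timeslot assignment. It is a generic ant-system skeleton, not the specific construction heuristics, constraint handling or parameter settings used in the paper; the heuristic values and solution quality are placeholders.

```python
import numpy as np

def choose_slot(pheromone_row, heuristic_row, alpha=1.0, beta=2.0, rng=None):
    """Pick a timeslot for one event with probability proportional to
    pheromone^alpha * heuristic^beta (the standard ant-system rule)."""
    rng = rng or np.random.default_rng()
    weights = (pheromone_row ** alpha) * (heuristic_row ** beta)
    return rng.choice(len(pheromone_row), p=weights / weights.sum())

def update_pheromone(pheromone, best_assignment, quality, rho=0.1):
    """Evaporate all trails, then reinforce the event->slot choices of the best ant."""
    pheromone *= (1.0 - rho)
    for event, slot in enumerate(best_assignment):
        pheromone[event, slot] += quality
    return pheromone

# Toy run: 4 events, 3 timeslots, uniform heuristic information.
rng = np.random.default_rng(0)
pheromone = np.ones((4, 3))
heuristic = np.ones((4, 3))
assignment = [choose_slot(pheromone[e], heuristic[e], rng=rng) for e in range(4)]
pheromone = update_pheromone(pheromone, assignment, quality=1.0)
```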

Medical image denoising using convolutional denoising autoencoders

Title Medical image denoising using convolutional denoising autoencoders
Authors Lovedeep Gondara
Abstract Image denoising is an important pre-processing step in medical image analysis. Different algorithms have been proposed over the past three decades with varying denoising performance. More recently, deep learning based models have shown great promise, having outperformed all conventional methods. These methods are, however, limited by the requirement of large training sample sizes and high computational costs. In this paper we show that, using a small sample size, denoising autoencoders constructed from convolutional layers can be used for efficient denoising of medical images. Heterogeneous images can be combined to boost the sample size for increased denoising performance. The simplest of networks can reconstruct images with corruption levels so high that noise and signal are not differentiable to the human eye.
Tasks Denoising, Image Denoising
Published 2016-08-16
URL http://arxiv.org/abs/1608.04667v2
PDF http://arxiv.org/pdf/1608.04667v2.pdf
PWC https://paperswithcode.com/paper/medical-image-denoising-using-convolutional
Repo
Framework
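
A convolutional denoising autoencoder of the kind described can be very compact. The sketch below is a generic Keras construction trained on (noisy, clean) pairs; the architecture, image size and noise level are illustrative assumptions rather than the exact network from the paper, and random arrays stand in for medical images.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_denoising_autoencoder(shape=(64, 64, 1)):
    """Small convolutional denoising autoencoder: noisy image in, clean image out."""
    inp = keras.Input(shape=shape)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
    x = layers.MaxPooling2D(2, padding="same")(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D(2, padding="same")(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    out = layers.Conv2D(1, 3, activation="sigmoid", padding="same")(x)
    model = keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

# Train on (noisy, clean) pairs; random arrays stand in for real medical images.
clean = np.random.rand(16, 64, 64, 1).astype("float32")
noisy = np.clip(clean + 0.2 * np.random.randn(*clean.shape), 0.0, 1.0).astype("float32")
model = build_denoising_autoencoder()
model.fit(noisy, clean, epochs=1, batch_size=4)
```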

On Fast Bilateral Filtering using Fourier Kernels

Title On Fast Bilateral Filtering using Fourier Kernels
Authors Sanjay Ghosh, Kunal N. Chaudhury
Abstract It was demonstrated in earlier work that, by approximating its range kernel using shiftable functions, the non-linear bilateral filter can be computed using a series of fast convolutions. Previous approaches based on shiftable approximation have, however, been restricted to Gaussian range kernels. In this work, we propose a novel approximation that can be applied to any range kernel, provided it has a pointwise-convergent Fourier series. More specifically, we propose to approximate the Gaussian range kernel of the bilateral filter using a Fourier basis, where the coefficients of the basis are obtained by solving a series of least-squares problems. The coefficients can be efficiently computed using a recursive form of the QR decomposition. By controlling the cardinality of the Fourier basis, we can obtain a good tradeoff between the run-time and the filtering accuracy. In particular, we are able to guarantee sub-pixel accuracy for the overall filtering, which is not provided by most existing methods for fast bilateral filtering. We present simulation results to demonstrate the speed and accuracy of the proposed algorithm.
Tasks
Published 2016-03-26
URL http://arxiv.org/abs/1603.08081v1
PDF http://arxiv.org/pdf/1603.08081v1.pdf
PWC https://paperswithcode.com/paper/on-fast-bilateral-filtering-using-fourier
Repo
Framework
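
The core step is approximating the Gaussian range kernel with a truncated cosine (Fourier) basis whose coefficients come from a least-squares fit. The sketch below shows only that fitting step for an 8-bit intensity range; the full fast filter additionally requires the shiftable decomposition into modulated spatial convolutions, which is omitted here, and the basis size and sigma are arbitrary choices.

```python
import numpy as np

def fit_fourier_range_kernel(sigma_r, T=255, K=8, n_samples=1024):
    """Least-squares fit of a truncated cosine series to the Gaussian range kernel.

    Approximates exp(-t^2 / (2 sigma_r^2)) on t in [-T, T] by
    sum_k c_k * cos(k * pi * t / T) and returns the coefficients c_0 .. c_{K-1}.
    """
    t = np.linspace(-T, T, n_samples)
    target = np.exp(-t ** 2 / (2.0 * sigma_r ** 2))
    basis = np.stack([np.cos(k * np.pi * t / T) for k in range(K)], axis=1)
    coeffs, *_ = np.linalg.lstsq(basis, target, rcond=None)
    return coeffs

# Inspect the maximum approximation error over the intensity range.
coeffs = fit_fourier_range_kernel(sigma_r=30.0, K=8)
t = np.linspace(-255, 255, 1024)
approx = np.stack([np.cos(k * np.pi * t / 255) for k in range(len(coeffs))], axis=1) @ coeffs
print(np.max(np.abs(approx - np.exp(-t ** 2 / (2.0 * 30.0 ** 2)))))
```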

Uniresolution representations of white-matter data from CoCoMac

Title Uniresolution representations of white-matter data from CoCoMac
Authors Raghavendra Singh
Abstract Tracing data as collated by CoCoMac, a seminal neuroinformatics database, is at multiple resolutions: different reports studied white-matter tracts at the level of areas and of their subdivisions. Network-theoretic analysis of this multi-resolution data often assumes that the data at various resolutions are equivalent, which may not be correct. In this paper we propose three methods to resolve the multi-resolution issue such that the resultant networks have connectivity data at only one resolution. The different resultant networks are compared in terms of their network analysis metrics and degree distributions.
Tasks
Published 2016-02-19
URL http://arxiv.org/abs/1602.06057v1
PDF http://arxiv.org/pdf/1602.06057v1.pdf
PWC https://paperswithcode.com/paper/uniresolution-representations-of-white-matter
Repo
Framework

Clustering-Based Relational Unsupervised Representation Learning with an Explicit Distributed Representation

Title Clustering-Based Relational Unsupervised Representation Learning with an Explicit Distributed Representation
Authors Sebastijan Dumancic, Hendrik Blockeel
Abstract The goal of unsupervised representation learning is to extract a new representation of data, such that solving many different tasks becomes easier. Existing methods typically focus on vectorized data and offer little support for relational data, which additionally describe relationships among instances. In this work we introduce an approach for relational unsupervised representation learning. Viewing a relational dataset as a hypergraph, new features are obtained by clustering vertices and hyperedges. To find a representation suited for many relational learning tasks, a wide range of similarities between relational objects is considered, e.g. feature and structural similarities. We experimentally evaluate the proposed approach and show that models learned on such latent representations perform better, have lower complexity, and outperform the existing approaches on classification tasks.
Tasks Relational Reasoning, Representation Learning, Unsupervised Representation Learning
Published 2016-06-28
URL http://arxiv.org/abs/1606.08658v3
PDF http://arxiv.org/pdf/1606.08658v3.pdf
PWC https://paperswithcode.com/paper/clustering-based-relational-unsupervised
Repo
Framework
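
The approach derives new features by clustering relational objects under several notions of similarity. The sketch below is a heavily simplified stand-in: it clusters instances on plain attribute vectors with k-means and uses one-hot cluster membership as the new representation, ignoring the hypergraph structure and the multiple similarity measures the paper combines.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_membership_features(attributes, n_clusters=5, seed=0):
    """Cluster instances on their attribute vectors and return one-hot cluster
    membership as a new feature representation for downstream learners."""
    labels = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit_predict(attributes)
    return np.eye(n_clusters)[labels]

X = np.random.rand(100, 8)            # placeholder attribute matrix
Z = cluster_membership_features(X)    # (100, 5) indicator features
```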

Two Methods For Wild Variational Inference

Title Two Methods For Wild Variational Inference
Authors Qiang Liu, Yihao Feng
Abstract Variational inference provides a powerful tool for approximate probabilistic inference on complex, structured models. Typical variational inference methods, however, require the use of inference networks with computationally tractable probability density functions. This largely limits the design and implementation of variational inference methods. We consider wild variational inference methods that do not require tractable density functions on the inference networks, and hence can be applied in more challenging cases. As an example application, we treat stochastic gradient Langevin dynamics (SGLD) as an inference network, and use our methods to automatically adjust the step sizes of SGLD, yielding significant improvement over hand-designed step size schemes.
Tasks
Published 2016-11-30
URL http://arxiv.org/abs/1612.00081v2
PDF http://arxiv.org/pdf/1612.00081v2.pdf
PWC https://paperswithcode.com/paper/two-methods-for-wild-variational-inference
Repo
Framework
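
The running example in the abstract treats stochastic gradient Langevin dynamics (SGLD) as the inference network whose step sizes get tuned. The sketch below shows only the plain SGLD update on a toy Gaussian target, not the wild variational objective used to adjust its step sizes; the step size and iteration count are arbitrary.

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One SGLD update: theta + (step/2) * grad log p(theta | data) + N(0, step)."""
    noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_post(theta) + noise

# Toy target: standard Gaussian posterior, so grad log p(theta) = -theta.
rng = np.random.default_rng(0)
theta = np.zeros(2)
samples = []
for _ in range(5000):
    theta = sgld_step(theta, lambda th: -th, step_size=0.01, rng=rng)
    samples.append(theta.copy())
print(np.std(samples, axis=0))  # should be close to 1 for the unit Gaussian target
```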