AMED: Automatic Mixed-Precision Quantization for Edge Devices
Quantized neural networks are well known for reducing latency, power consumption, and model size without significant harm to accuracy, which makes them highly suitable for systems with limited resources and a low power budget. Mixed-precision quantization offers better utilization of customized hardware that supports arithmetic operations at different bitwidths. Existing quantization methods either minimize the compression loss for a desired compression ratio or optimize a proxy property of the model (such as FLOPs or model size); both strategies lead to inefficient performance when the model is deployed on specific hardware. More importantly, they assume that the loss manifold of the quantized model has a global minimum that coincides with the global minimum of its full-precision counterpart. Challenging this assumption, we argue that the optimal minimum shifts as the precision changes, and that quantization is therefore better viewed as a random process. This view lays the foundation for a different approach to quantizing neural networks: during training, the model is quantized to different precisions, the bit allocation is treated as a Markov Decision Process, and an optimal bitwidth allocation is found by measuring specified behaviors on a specific device via direct signals from the target hardware architecture. By doing so, we avoid the assumption that the loss behaves the same way once the model is quantized. Automatic Mixed-Precision Quantization for Edge Devices (dubbed AMED) demonstrates its superiority over current state-of-the-art schemes in terms of the trade-off between neural network accuracy and hardware efficiency, backed by a comprehensive evaluation.
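To make the idea concrete, the sketch below illustrates (it is not the authors' implementation) per-layer uniform quantization and an MDP-flavored search over bitwidth allocations, where the reward mixes a task metric with a directly measured hardware signal. All names (fake_quantize, evaluate_accuracy, measure_latency) and the greedy acceptance rule are assumptions made for illustration; a real run would use the actual model, dataset, and device measurements.

```python
# Minimal sketch, not the authors' code: per-layer uniform quantization and a
# greedy, MDP-style bitwidth search driven by a measured hardware signal.
import numpy as np

def fake_quantize(w: np.ndarray, bits: int) -> np.ndarray:
    """Uniform symmetric quantization of a weight tensor to `bits` bits."""
    qmax = 2 ** (bits - 1) - 1
    max_abs = np.max(np.abs(w))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    return np.clip(np.round(w / scale), -qmax - 1, qmax) * scale

def evaluate_accuracy(weights, bit_alloc):
    """Placeholder: run validation with layer i quantized to bit_alloc[i].
    Here we only penalize quantization error; a real run would use a dataset."""
    q = [fake_quantize(w, b) for w, b in zip(weights, bit_alloc)]
    return 1.0 - sum(np.mean((w - qw) ** 2) for w, qw in zip(weights, q))

def measure_latency(bit_alloc):
    """Placeholder for a direct signal from the target device (e.g. measured ms)."""
    return float(sum(bit_alloc))  # stand-in: lower bitwidths -> lower latency

def greedy_bit_search(weights, candidate_bits=(2, 4, 8), latency_weight=0.01, steps=50):
    """State: the current bitwidth allocation. Action: change one layer's bitwidth.
    The change is kept only if the reward (accuracy - lambda * latency) improves."""
    rng = np.random.default_rng(0)
    alloc = [max(candidate_bits)] * len(weights)          # start at the highest precision
    best = evaluate_accuracy(weights, alloc) - latency_weight * measure_latency(alloc)
    for _ in range(steps):
        layer = rng.integers(len(weights))                # pick a layer ...
        new_alloc = list(alloc)
        new_alloc[layer] = int(rng.choice(candidate_bits))  # ... and a new bitwidth for it
        reward = (evaluate_accuracy(weights, new_alloc)
                  - latency_weight * measure_latency(new_alloc))
        if reward > best:                                 # accept only improving transitions
            alloc, best = new_alloc, reward
    return alloc

if __name__ == "__main__":
    toy_layers = [np.random.randn(64, 64) for _ in range(4)]  # toy "model"
    print(greedy_bit_search(toy_layers))
```

The design choice to score each candidate allocation with a measured latency term, rather than a proxy such as FLOPs, is what ties the search to the specific target device, as the abstract describes.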