Bug 90429

Summary: JPEG decoding shared memory leak
Product: libva    Reporter: Lingzhu Xiang <lingzhu>
Component: intel    Assignee: haihao <haihao.xiang>
Status: RESOLVED DUPLICATE    QA Contact: Sean V Kelley <seanvk>
Severity: normal    
Priority: medium    
Version: unspecified   
Hardware: x86-64 (AMD64)   
OS: Linux (All)   
Whiteboard:
i915 platform: i915 features:
Attachments: vaapi_jpeg_decoder.cpp
sample.jpg

Description Lingzhu Xiang 2015-05-12 21:46:34 UTC
Created attachment 115725 [details]
vaapi_jpeg_decoder.cpp

Platform: Haswell, i7-4600U
Kernel: 4.0.0-1-amd64
Distro: Debian stretch/testing
Library versions:
i965-va-driver 1.5.1-2
libdrm-intel1 2.4.60-3
libva1      1.5.1-2

Steps to reproduce:
1. g++ vaapi_jpeg_decoder.cpp -ljpeg -lva -lva-drm
2. ./a.out sample.jpg

Actual output:
The a.out process's own memory usage barely increases.
`free -m` shows system shared memory usage increases by several hundred MB per second until full.
System begins swapping. OOM killer gets triggered.
Valgrind detects no leak.

Expected output:
No accumulating memory usage

Information:
I tried to use libva to process a stream of JPEG files. The decoder I wrote reuses surfaces and other structures across JPEG files.

A cursory look at libdrm shows some complex memory-management infrastructure that is beyond my depth.
Comment 1 Lingzhu Xiang 2015-05-12 21:47:04 UTC
Created attachment 115726 [details]
sample.jpg
Comment 2 Lingzhu Xiang 2015-05-13 18:23:43 UTC
I find that after I explicitly call vaDestroyBuffer() on all created buffers, the leak disappears.

But the documentation comment on vaRenderPicture() says "Buffers are automatically destroyed afterwards". The reproducer here followed this documentation and did not explicitly destroy the buffers.
Comment 3 haihao 2015-11-26 03:08:58 UTC

*** This bug has been marked as a duplicate of bug 75287 ***
