| Summary: | JPEG decoding shared memory leak | | |
|---|---|---|---|
| Product: | libva | Reporter: | Lingzhu Xiang <lingzhu> |
| Component: | intel | Assignee: | haihao <haihao.xiang> |
| Status: | RESOLVED DUPLICATE | QA Contact: | Sean V Kelley <seanvk> |
| Severity: | normal | | |
| Priority: | medium | | |
| Version: | unspecified | | |
| Hardware: | x86-64 (AMD64) | | |
| OS: | Linux (All) | | |
| Whiteboard: | | | |
| i915 platform: | | i915 features: | |
| Attachments: | vaapi_jpeg_decoder.cpp, sample.jpg | | |
Created attachment 115726 [details]
sample.jpg
I find that after I explicitly call vaDestroyBuffer() on all created buffers, the leak disappears. But the comment on vaRenderPicture() says "Buffers are automatically destroyed afterwards". The reproducer here followed this documentation and did not explicitly destroy buffers.
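For reference, a minimal sketch of the workaround described in this comment, assuming a per-picture submission helper; the function name, `buf_ids` container, and error handling are illustrative and not taken from the attached reproducer:

```cpp
#include <va/va.h>
#include <vector>

// Submit the decode buffers for one JPEG, then destroy them explicitly
// instead of relying on vaRenderPicture() to free them.
static bool submit_one_jpeg(VADisplay dpy, VAContextID ctx,
                            VASurfaceID surface,
                            std::vector<VABufferID> &buf_ids)
{
    if (vaBeginPicture(dpy, ctx, surface) != VA_STATUS_SUCCESS)
        return false;

    // buf_ids holds the picture parameter, IQ matrix, Huffman table,
    // slice parameter and slice data buffers created with vaCreateBuffer().
    if (vaRenderPicture(dpy, ctx, buf_ids.data(),
                        (int)buf_ids.size()) != VA_STATUS_SUCCESS)
        return false;

    VAStatus status = vaEndPicture(dpy, ctx);

    // Workaround for the leak reported here: destroy every buffer
    // explicitly, even though the vaRenderPicture() documentation says
    // buffers are destroyed automatically afterwards.
    for (VABufferID id : buf_ids)
        vaDestroyBuffer(dpy, id);
    buf_ids.clear();

    return status == VA_STATUS_SUCCESS &&
           vaSyncSurface(dpy, surface) == VA_STATUS_SUCCESS;
}
```

With the explicit vaDestroyBuffer() loop in place, the shared memory growth described in this report stops.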
Created attachment 115725 [details]
vaapi_jpeg_decoder.cpp

Platform: Haswell, i7-4600U
Kernel: 4.0.0-1-amd64
Distro: Debian stretch/testing
Library versions:
i965-va-driver 1.5.1-2
libdrm-intel1 2.4.60-3
libva1 1.5.1-2

Steps to reproduce:
1. g++ vaapi_jpeg_decoder.cpp -ljpeg -lva -lva-drm
2. ./a.out sample.jpg

Actual output:
a.out's own memory usage barely increases, but `free -m` shows system shared memory usage growing by several hundred MB per second until memory is full. The system begins swapping and the OOM killer is triggered. Valgrind detects no leak.

Expected output:
No accumulating memory usage.

Information:
I tried to use libva to process a stream of JPEG files. The decoder I wrote reuses the surface and other structures across JPEG files. A cursory look at libdrm shows some complex memory-management infrastructure that is beyond my reach.
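A rough outline of this scenario for readers without the attachment, assuming the usual libva-on-DRM setup; the device path, surface size, and loop bound are placeholders, the per-file buffer creation is elided, and the real vaapi_jpeg_decoder.cpp may differ in detail:

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <va/va.h>
#include <va/va_drm.h>
#include <vector>

int main()
{
    int drm_fd = open("/dev/dri/renderD128", O_RDWR);  // assumed render node
    VADisplay dpy = vaGetDisplayDRM(drm_fd);
    int major = 0, minor = 0;
    vaInitialize(dpy, &major, &minor);

    VAConfigID config;
    vaCreateConfig(dpy, VAProfileJPEGBaseline, VAEntrypointVLD,
                   nullptr, 0, &config);

    // One surface and one context, created once and reused for every JPEG.
    VASurfaceID surface;
    vaCreateSurfaces(dpy, VA_RT_FORMAT_YUV420, 1920, 1080,
                     &surface, 1, nullptr, 0);
    VAContextID ctx;
    vaCreateContext(dpy, config, 1920, 1080, VA_PROGRESSIVE,
                    &surface, 1, &ctx);

    for (int i = 0; i < 100000; i++) {  // stream of JPEG files
        std::vector<VABufferID> buf_ids;
        // ... vaCreateBuffer() calls for VAPictureParameterBufferType,
        // VAIQMatrixBufferType, VAHuffmanTableBufferType,
        // VASliceParameterBufferType and VASliceDataBufferType go here ...

        vaBeginPicture(dpy, ctx, surface);
        vaRenderPicture(dpy, ctx, buf_ids.data(), (int)buf_ids.size());
        vaEndPicture(dpy, ctx);
        vaSyncSurface(dpy, surface);
        // No vaDestroyBuffer() here, per the vaRenderPicture()
        // documentation; this is where shared memory keeps growing
        // with the i965 driver.
    }

    vaDestroyContext(dpy, ctx);
    vaDestroySurfaces(dpy, &surface, 1);
    vaDestroyConfig(dpy, config);
    vaTerminate(dpy);
    close(drm_fd);
    return 0;
}
```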