Very basic question: Is there a quick way to median-filter a video?

To be specific, I mean a temporal median filter that calculates a median with respect to time on a pixel-by-pixel basis. It's SUPER good at getting rid of noise while preserving signal features, but holy cow it's computationally expensive.

Here's a simple python/numpy script for what I mean:

import numpy as np

def denoise_median(input_video, window):
    print('removing extreme spikes...')
    num_frames = input_video.shape[0]
    half = window // 2
    # copy instead of np.empty_like so the edge frames (which the loop skips) stay defined
    video_denoised = input_video.copy()
    for i in range(half, num_frames - half):
        if i % 40 == 0:
            print('   ', int(100 * i / num_frames), '% complete')
        video_denoised[i] = np.median(input_video[i - half:i + half + 1], axis=0)
    return video_denoised

[EDIT] move_median() from the "bottleneck" package is MUCH faster.
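For completeness, the centered temporal median can also be vectorized in plain numpy with sliding_window_view, which is usually much faster than the frame-by-frame loop (note that bottleneck's move_median uses a *trailing* window, so its output needs shifting by window//2 to match a centered median). A sketch — denoise_median_fast is a made-up name, and the window view still gets materialized by np.median:

```python
import numpy as np

def denoise_median_fast(video, window):
    # view of shape (num_frames - window + 1, H, W, C, window); no copy is made here
    sw = np.lib.stride_tricks.sliding_window_view(video, window, axis=0)
    half = window // 2
    out = video.copy()  # edge frames are left as-is, matching the loop version
    out[half:video.shape[0] - half] = np.median(sw, axis=-1)
    return out
```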

πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/Syntaximus
πŸ“…︎ Jan 12 2022
🚨︎ report
Is there a website that exists to sort/filter countries by a bunch of criteria like weather, median income, internet speed, etc.?
πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/lucid_frog94
πŸ“…︎ Nov 03 2021
🚨︎ report
GIMP's median blur filter can make diamond bokeh backgrounds. reddit.com/gallery/r8g65w
πŸ‘︎ 2
πŸ’¬︎
πŸ“…︎ Dec 04 2021
🚨︎ report
Fast / established method to implement a median filter?

I’m coming back (from Python) to C++ after many years, and am fumbling around trying to implement an efficient median filter for an array of values.

Is there an established / efficient method to achieve this? I’m basically trying to do the same thing as achieved by scipy.signal.medfilt.

There seem to be several algorithms suggested by the community, but none that seems well established. Help!

Edit: I should have specified my use case right at the beginning:

My use case is that I want to remove spikes from a signal. By median filter I do mean a sliding-window median over a long signal. My signal is on the order of 10^5 - 10^6 samples.

Thank you very much for your feedback, I will take a look at nth_element!
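For anyone landing here from search: the usual C++ answers are std::nth_element per window (simple, O(n·w)) or keeping the window in sorted order so each step is cheap. A stdlib-only Python sketch of the sorted-buffer idea — loosely the same sliding-window median scipy.signal.medfilt computes, minus the edge padding:

```python
import bisect

def sliding_median(x, window):
    # maintain a sorted buffer of the current window; each step is a
    # binary search plus an O(window) shift, instead of a full re-sort
    buf = sorted(x[:window])
    out = [buf[window // 2]]
    for i in range(window, len(x)):
        del buf[bisect.bisect_left(buf, x[i - window])]  # drop outgoing sample
        bisect.insort(buf, x[i])                         # insert incoming sample
        out.append(buf[window // 2])
    return out  # len(x) - window + 1 medians ("valid" mode, no edge padding)
```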

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/besse
πŸ“…︎ Sep 30 2021
🚨︎ report
I did an image noise correction filter using the median filter technique and multiprocessing. All in C! Link to repo in the comments.
πŸ‘︎ 126
πŸ’¬︎
πŸ‘€︎ u/theBadRoboT84
πŸ“…︎ Jun 19 2021
🚨︎ report
[Question] Applying median filter to specific parts only

Hello, I would like to know if there is a way to apply a median filter only to the parts of an image (numpy array) where a special value exists. Can someone help me out?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/omar3bdellatif
πŸ“…︎ May 13 2021
🚨︎ report
How to calculate a weighted median (not average) with frequency and filter?

https://i.imgur.com/kDF1OA9.png (with 3 columns: X, frequency, filter)

The filtered sample will be {3,4,4,5,5,5,5} and the expected result for the weighted median = 5
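The linked screenshot isn't reproduced here, but given the stated filtered sample {3,4,4,5,5,5,5}, the frequency-weighted median can be cross-checked in a few lines of numpy (the column values below are back-filled from that sample, so treat them as assumptions):

```python
import numpy as np

x = np.array([3, 4, 5])              # X
freq = np.array([1, 2, 4])           # frequency
keep = np.array([True, True, True])  # filter column (all pass in this toy version)

sample = np.repeat(x[keep], freq[keep])  # expands to [3, 4, 4, 5, 5, 5, 5]
weighted_median = np.median(sample)      # the 4th of 7 sorted values
```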

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/zacce
πŸ“…︎ Mar 29 2021
🚨︎ report
Median filters in python

I want to apply a median filter to an image, but I only want to apply it at positions where some special value exists. Is there an efficient way to do so without having to loop over the entire image?
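One loop-free approach (for this post and the earlier duplicate): median-filter the whole image vectorized, then use the mask to decide which pixels take the filtered value. A numpy-only sketch with a hypothetical median_where helper and a 3x3 window:

```python
import numpy as np

def median_where(img, mask, size=3):
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')  # replicate edges
    # all size x size neighborhoods at once: shape (H, W, size, size)
    win = np.lib.stride_tricks.sliding_window_view(padded, (size, size))
    med = np.median(win, axis=(-2, -1))
    return np.where(mask, med, img)         # filtered only where mask is True
```

This does compute the median everywhere; if the mask is very sparse, looping over np.argwhere(mask) positions can be cheaper despite the Python loop.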

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/omar3bdellatif
πŸ“…︎ May 13 2021
🚨︎ report
How would the Median filter perform on small variance noise?

As the title of this post suggests, how would a median filter perform when an image has small-variance noise, such as the noise created by the differing sensitivities of camera sensors? Would it be better to smooth this noise using a mean filter instead, such as the adaptive Kuwahara filter?

Thanks in advance!
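A quick numpy experiment supports the intuition behind the question: on pure Gaussian noise the windowed median is a noisier estimator than the windowed mean (asymptotically about π/2 worse in variance), so a mean-type smoother is the better fit there; the median earns its keep on impulsive/outlier noise:

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 100_000)  # Gaussian sensor-style noise
win = np.lib.stride_tricks.sliding_window_view(noise, 5)
med_out = np.median(win, axis=1)
mean_out = np.mean(win, axis=1)
# residual spread after filtering: the mean filter wins on Gaussian noise
print(med_out.std(), mean_out.std())
```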

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/XonDoi
πŸ“…︎ Oct 23 2020
🚨︎ report
Where is the median filter?

Hi,

I am using GIMP 2.8.22 and I can't find the median filter anywhere. Does it have another name?

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/lethargical46745
πŸ“…︎ Jul 02 2020
🚨︎ report
How do I implement a "5 by 5 Separable Gaussian filter" and a "5 by 5 Median filter"? What is the 5x5?

I was told to:

β€’ Read in a gray level image
β€’ Add Gaussian noise to the image with sigma =10
β€’ Implement:
5 by 5 Separable Gaussian filter with sigma =0.8
5 by 5 Median filter

I use imnoise and imgaussfilt to add noise and a Gaussian filter, but I have no idea what the "5x5" portion means. What does it mean and how do I use these methods to accommodate it? medfilt2 has a parameter to make it 5x5, but I don't see one available for a Gaussian filter.
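"5 by 5" is the spatial size of the filter kernel: each output pixel is computed from a 5x5 neighborhood (in MATLAB, imgaussfilt takes this via its 'FilterSize' option and medfilt2 via [5 5]). "Separable" means the 5x5 Gaussian factors into two 1-D passes. A numpy sketch of the kernel construction:

```python
import numpy as np

sigma = 0.8
r = np.arange(-2, 3)          # 5 taps: the "5" in 5x5
g = np.exp(-r**2 / (2 * sigma**2))
g /= g.sum()                  # normalized 1-D Gaussian
kernel_2d = np.outer(g, g)    # the full 5x5 kernel
# separability: filtering rows with g, then columns with g, equals one
# 2-D convolution with kernel_2d (10 multiplies per pixel instead of 25)
```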

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/RayzTheRoof
πŸ“…︎ Mar 12 2020
🚨︎ report
How to denoise ECG Signal with median filter using WFDB for Python?

Anybody familiar with WFDB application for Python?

So I'm trying to denoise MIT-BIH Arrhythmia Database (mitdb) downloaded from Physionet using median filter. Using WFDB, I can read the signal data with the following code:

 record = wfdb.rdrecord('mitdb/100', sampto=3000) 
 ann = wfdb.rdann('mitdb/100', 'atr', sampto=3000)

Then, when it comes to denoising, I read the WFDB documentation for Python and there is no function to do a median filter, unlike WFDB for MATLAB, which has the function medfilt.

I am aware that SciPy Signal has a medfilt function, but if I read the signal data with WFDB, how can I pass the data into SciPy's medfilt? Or should I use another method to read/open the data so I can use medfilt in SciPy? Or perhaps anybody knows how to do median filtering with WFDB in Python?

TIA
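For later readers: wfdb.rdrecord returns a Record whose .p_signal attribute is already a plain numpy array, so it can go straight into scipy.signal.medfilt. A sketch — the wfdb lines are commented out since they need the mitdb files on disk, and the stand-in array and kernel size are assumptions:

```python
import numpy as np
from scipy.signal import medfilt

# record = wfdb.rdrecord('mitdb/100', sampto=3000)
# sig = record.p_signal[:, 0]          # channel 0 as a 1-D numpy array
sig = np.array([0., 0., 5., 0., 0., 0., -5., 0., 0.])  # stand-in with two spikes
clean = medfilt(sig, kernel_size=3)    # kernel_size must be odd
```

For ECG baseline-wander removal specifically, a common recipe is two cascaded median filters (roughly 200 ms and 600 ms windows) whose output is subtracted from the signal.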

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/fhp0223
πŸ“…︎ Aug 05 2020
🚨︎ report
C++ implementation of the paper Median Filter In Constant Time github.com/Ldpe2G/ArmNeon…
πŸ‘︎ 20
πŸ’¬︎
πŸ‘€︎ u/Ldpe2G
πŸ“…︎ Oct 20 2019
🚨︎ report
Median in a pivot table where it changes with filter criteria

Hi All,

This may be a bit of a long shot, but worth an ask. I have found a way to calculate the median exam mark in an excel pivot table using the steps in this article https://www.pryor.com/blog/how-to-calculate-median-in-an-excel-pivottable-almost/

However I have marks across multiple academic years and when I filter on the academic year in the pivot table, the median stays the same. Is there a way to get the median to change with the academic year filter and not show the median across all the years in my dataset?

Thanks in advance
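The behavior wanted here — a median recomputed for whatever slice the filters leave — is exactly a grouped median; the PivotTable workaround likely bakes precomputed medians into a helper column, which filters can't reach. As a cross-check outside Excel, a pandas sketch with made-up marks:

```python
import pandas as pd

marks = pd.DataFrame({'year': [2019, 2019, 2020, 2020, 2020],
                      'mark': [50, 70, 40, 60, 80]})
per_year = marks.groupby('year')['mark'].median()  # recomputed per year
```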

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/Explorer1007
πŸ“…︎ Apr 28 2020
🚨︎ report
Implementing a median filter for sensor data powertronika.com/2020/04/…
πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/the_555_guy
πŸ“…︎ Apr 25 2020
🚨︎ report
Calculate median taking into account filters

Hi all,

I'm having an issue calculating the correct median that I want. Basically, I have several filters for various attributes, and I also have a conditional dimension filter that is set to exclude certain rows of data that don't meet a certain threshold for one of the metrics - I want this conditional filter to take into account all the other filters, so I added the others to Context in order to do this.

Then, after filtering the data using both the Context filters and the Dimension filter, I want to calculate the median of one of the other metrics (I plan to eventually turn this into a filter where I'm looking at only values greater than the median, but first I have to make sure I can calculate the median correctly.)

However, the way I currently have it set up, the median does NOT take into account the dimension filter. How can I make it so that the median is calculated after that filter is applied?

To help clarify the problem, I've attached screenshots from a sample workbook which is populated with fake data that I generated arbitrarily for the sake of an example. In this example, each row is a pop-punk band and the columns are number of songs played and ticket cost. The columns are meant to be sums over a particular date (which is the context filter in this example). I want to calculate the median number of songs played for that day across all bands. (So basically, the median of a sum. I created a fixed LOD expression to get the sum of songs played by each band and it's called "Total Songs"). The dimension filter is "Cost > Minimum" which is meant to filter out all bands whose total ticket cost for a day is <= $15.

For instance, look at September 1, 2019. The median is correct (15) when the cost filter is not applied (first image). However, when I set "Cost > Minimum" to be true, 3 rows are excluded but the median stays the same - it says 15 when it should actually be 33 (second image).

Essentially it seems that I need things to happen in three steps: first, apply the date filter; second, apply the dimension filter to exclude low-cost bands; third, calculate the median. Is that possible somehow?

For clarity, in my actual workbook, I have more than one context filter (here, it's just date) but I wanted to make a stripped-down version for the sake of an example and to avoid sharing private information from my company (:

Thanks in advance for any advice!

(edited for formatting)

[Before the cost filter (top right) is applied]


πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/EternalSunshine99
πŸ“…︎ Oct 03 2019
🚨︎ report
Median Filter Implementation in Python github.com/MeteHanC/Pytho…
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/TemurMelik
πŸ“…︎ Apr 13 2018
🚨︎ report
Broken const-time median filter created interesting results
πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/Haalonean
πŸ“…︎ Feb 28 2019
🚨︎ report
DAE have extreme difficulty finding a median between overthinking all social interactions and lacking a filter completely and unintentionally upsetting people?
πŸ‘︎ 12
πŸ’¬︎
πŸ‘€︎ u/serenitative
πŸ“…︎ Feb 20 2016
🚨︎ report
I'm not sure where to start with this IEEEXtreme problem (Median Filter/Milli Vamedian)

This apparently was a contest problem in one of the IEEE contests in the past. Now, I have to do it for homework. I'm not sure where to start. If anybody can offer some hints, ideas, explanations I will be most grateful.

The full problem can be read here: http://olympiad.bytefreaks.net/events/c-median-filter

I guess I don't understand which numbers to find the median of. The inputs and outputs don't make exact sense to me. Also, how would I store and calculate it? I am fairly versed in C++ and Java (I will be writing this one in C++). I am thinking that might be helpful in determining some part of the STL to use.

Either way, I'm not expecting/hoping someone will write the code for me, but I am looking for some guidance if anyone can offer it.

Thank you.


EDIT: I figured it out. Sorry, I had gotten so entrenched in solving it that I had almost forgotten I posted this. I guess I didn't have enough faith in myself and was ready to give up too early. Anyway, the solution involves finding the median of the 9 integers (3x3 matrix) surrounding each integer. So if you're imagining a matrix, and the number is at location [i][j], you find the median of the numbers at [i-1][j-1], [i-1][j], [i-1][j+1], [i][j-1], [i][j], [i][j+1], [i+1][j-1], [i+1][j], [i+1][j+1].

If the integer is in a location at the end of a row, column, or both (corner), such as [0][0], [0][1], etc., then you use the integers on the edge. So, say a corner of a matrix is as such:

 1  2  3...

 4  5  6...

...

and you are finding the replacement value for the 1 in the corner. Then you would have to find the median of the numbers: 1 1 2 1 1 2 4 4 5. Similarly, for the 2, the numbers whose median you need are: 1 2 3 1 2 3 4 5 6.

I stored the values in a vector of vectors.

vector< vector<int> > matrix;

Then I stored each median for each situation (a lot of confusing if-else statements) into a normal vector, put those values into a new vector-of-vectors matrix, and then output the final, new matrix.

This post got more upvotes than I was anticipating, so I'm guessing others are having to do this as well. If you have more questions, feel free to ask, and I will try to help by pointing you in the right direction.
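The edge-replication scheme the edit describes can be sketched compactly (numpy rather than the poster's C++ vectors, so treat it as a cross-check, not the contest solution):

```python
import numpy as np

def median_filter_3x3(matrix):
    m = np.asarray(matrix)
    padded = np.pad(m, 1, mode='edge')  # replicate border values, as described above
    out = np.empty_like(m)
    for i in range(m.shape[0]):
        for j in range(m.shape[1]):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out
```

For the corner example above, the neighborhood of the 1 is {1,1,2,1,1,2,4,4,5}, whose median is 2.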

πŸ‘︎ 28
πŸ’¬︎
πŸ‘€︎ u/Skoolz
πŸ“…︎ Nov 04 2012
🚨︎ report
What's the best way to clean a dirty studio backdrop? In Photoshop the median filter solves this very easily imgur.com/LjZfI4s
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/enkumber
πŸ“…︎ Mar 26 2016
🚨︎ report
Is there a way to apply a median filter directly to a time-varying rotation matrix?

I need to write a MATLAB script for a project which would take a series of rotation matrices (referring to an actual physical object's orientation) and apply a median filter to it to eliminate speckle noise.

The way I'm doing it right now is by converting the matrix into three Euler angles, filter them and then use them to construct a new rotation matrix. The problem is that the numerical inaccuracy creeping in during the double conversion seems to be too significant for the filtering to be useful to me. Is there any way to filter the rotation matrix directly?
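One way to reduce the round-trip drift is to let a library handle the conversions with proper renormalization; scipy's Rotation class does this. A sketch of the same Euler route (Python instead of MATLAB, with an identity series as stand-in data); for large rotations, filtering quaternion components and renormalizing avoids Euler wrap-around issues:

```python
import numpy as np
from scipy.signal import medfilt
from scipy.spatial.transform import Rotation

mats = np.stack([np.eye(3)] * 11)                    # stand-in rotation series
angles = Rotation.from_matrix(mats).as_euler('xyz')  # (N, 3) Euler angles
smoothed = np.column_stack([medfilt(angles[:, k], 5) for k in range(3)])
out = Rotation.from_euler('xyz', smoothed).as_matrix()
```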

πŸ‘︎ 13
πŸ’¬︎
πŸ‘€︎ u/Sanuuu
πŸ“…︎ Aug 07 2014
🚨︎ report
The market has dropped 48% from its 90 day high and here I tabulate how many months it took to recover from such an event.

source: Coinbase API. They only have data starting Jan 1, 2016, so this is where I start my analysis.

Since 2016 this event has occurred 14 times in monthly bins.

method:

  • I calculated the lagging 90 day high closing price and then figure out how many days and months it took to recover.
  • Currently bitcoin is down ~48% from the 90 day high, so I filter for the months in which this has occurred.
  • I then figure out how many days or months it took to recover had you bought at the absolute highest.
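The "lagging 90 day high" step is a rolling max over closing prices; a minimal pandas sketch (column names mirror the table below, but the prices are invented):

```python
import pandas as pd

df = pd.DataFrame({'close': [10., 12., 11., 6., 7., 13.]})
df['max_90'] = df['close'].rolling(90, min_periods=1).max()  # trailing 90-row high
df['pct_change'] = 100 * (df['close'] / df['max_90'] - 1)    # drawdown from that high
```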

results:

date close BTC.USD.volume max_90 pct_change recover_days recover_months
2018-01 9014.23 38171.97 19650.01 -54.1 1033 34.4
2018-02 6905.19 59578.6982 19650.01 -64.9 1029 34.3
2018-03 6816.01 15434.5228 17098.99 -60.1 961 32
2018-04 6619.01 10756.5497 17098.99 -61.3 956 31.9
2018-06 5851.66 7769.81165 9800 -40.3 358 11.9
2018-11 3731.32 36455.2237 7360 -49.3 168 5.6
2018-12 3183 9343.2724 6750 -52.8 147 4.9
2019-01 3397.42 8709.6973 6503.12 -47.8 102 3.4
2019-02 3409.57 6032.81895 6503.12 -47.6 96 3.2
2020-03 4857.1 113902.203 10371.33 -53.2 137 4.6
2021-05 34627.82 27999.1507 63588.22 -45.5 143 4.8
2021-06 31594.63 26505.1927 63588.22 -50.3 116 3.9
2021-07 29796.16 18114.1529 58958.05 -49.5 87 2.9
2022-01 35101.33 21310.7209 67554.84 -48

So as you can see, historically this has occurred 14 times since 2016. 2018 was the worst: just when you thought you had bought at the absolute low, the 90 day high kept shifting and every month it kept on dropping.

Median

The median time to recover is 4.9 months, or 147 days. The biggest drop was 64.9%, in Feb 2018.

Here is a histogram.

https://preview.redd.it/7ul6wj37z9d81.png?width=600&format=png&auto=webp&s=640ecac37cc7b325ffa979954eac0cbbd38b8fc0

TLDR: buckle up; historically it takes about 4.9 months (147 days) to recover from a drop of 40-64%. The shortest recovery took 2.9 months and the longest 34.4 months. The biggest drop ever was in Feb 2018, about 64.9%, and that one took 1029 days to recover. The good news is that it will likely recover again.

edit: A few comments below observed that the recovery phase seems to be getting faster, so I plotted this as well.

recover time

πŸ‘︎ 6k
πŸ’¬︎
πŸ‘€︎ u/greenappletree
πŸ“…︎ Jan 22 2022
🚨︎ report
Is there a way to filter out posts from people 18 and under? Idgaf about anything anyone younger than 13 has to contribute to reddit. Sort of the opposite of a parental filter?

Like adult swim?

πŸ‘︎ 2k
πŸ’¬︎
πŸ‘€︎ u/TruestJedi
πŸ“…︎ Dec 11 2021
🚨︎ report
