A collection of questions and discussions related to "Radial basis function kernel"
Anyone tried using multiple linear regression with radial basis functions to approximate the future market price? You could take two weeks of data, for example, as Y, and have X1 as the sum of two radial basis functions centred on each Monday, X2 for the two Tuesdays, etc. Then use the multiple linear regression to calculate weights for each radial basis sum, producing a function that approximates the future market price.
I have been searching for days now and I know that there is a formula; however, everybody uses different notation and nobody actually gives a coherent formula. I know it has to do with the derivative of the error with respect to the weights, but I don't know how to find this. Many people online also mention linear regression, but they say just that and nothing else. They never say how to apply it to the network to find the weights. Any help would be appreciated!
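For what it's worth, here is a hedged sketch (toy data and names invented, not from any particular reference) of the two standard ways to get the output weights of an RBF network with fixed centres: the closed-form pseudoinverse solution, and gradient descent on the squared error, whose gradient with respect to the weight vector is Phi.T @ (Phi @ w - y):

```python
import numpy as np

def rbf_design_matrix(X, centers, beta):
    # Phi[i, j] = exp(-beta * ||x_i - c_j||^2)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-beta * d2)

# toy 1-D data (an assumption for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0])
centers = np.linspace(-1.0, 1.0, 10)[:, None]
beta = 5.0

Phi = rbf_design_matrix(X, centers, beta)

# Closed form: w = pinv(Phi) @ y minimizes ||Phi w - y||^2
w_ls = np.linalg.pinv(Phi) @ y

# Gradient descent on E = 0.5 * ||Phi w - y||^2, with dE/dw = Phi.T @ (Phi w - y)
w = np.zeros(len(centers))
lr = 0.01
for _ in range(10000):
    w -= lr * (Phi.T @ (Phi @ w - y))
```

Both routes should reach essentially the same fit; the pseudoinverse is the usual choice when the centres and widths are held fixed, because the problem is then plain linear least squares.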
https://math.stackexchange.com/questions/1373097/why-does-minimizing-hf-sumn-i-1y-i-fx-i2-lambda-pf-2-lea
https://math.stackexchange.com/questions/1897266/how-does-one-derive-radial-basis-function-rbf-networks-as-the-smoothest-interp
How does one derive that RBFs are the optimal way to fit the data such that the learned curve is the smoothest (has small gradients)?
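A hedged sketch of the standard regularization-theory argument (along the lines of Poggio and Girosi's derivation; worth checking against the linked threads):

```latex
\text{Minimize } H[f] = \sum_{i=1}^{n} \bigl(y_i - f(x_i)\bigr)^2 + \lambda \lVert P f \rVert^2 .

\text{Setting the functional derivative to zero gives }
P^{*}P\, f = \tfrac{1}{\lambda} \sum_{i=1}^{n} \bigl(y_i - f(x_i)\bigr)\, \delta(\cdot - x_i),

\text{so, with } G \text{ the Green's function of } P^{*}P
\ \bigl(P^{*}P\, G(x, x') = \delta(x - x')\bigr),

f(x) = \sum_{i=1}^{n} w_i\, G(x, x_i), \qquad w_i = \frac{y_i - f(x_i)}{\lambda}.

\text{If the penalty } \lVert P f \rVert^2 \text{ is translation- and rotation-invariant,}
\text{ then } G(x, x') = g(\lVert x - x' \rVert) \text{: a radial basis function.}
```

The "smoothest" interpretation is that $\lVert P f \rVert^2$ penalizes large derivatives, so the minimizer is the interpolant with the smallest penalized roughness; a Gaussian basis corresponds to one particular (infinite-order) derivative penalty.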
I'm working on a continuous state and action Q-learning algorithm, using radial basis networks to store the value function. It works OK, but I'm not completely happy with the way I'm doing the argmax to get the action. Basically I have a radial basis function and I have to find its maximum. The function looks more or less like this.
Things I've tried so far:
PSO optimization. Works ok, but too slow.
Golden section search. Not that fast either, and prone to getting stuck in a local maximum.
Polynomial curve fitting. Basically fit a polynomial curve and then find its maximum analytically via the roots of its derivative. It works pretty fast, but for a low number of polynomial coefficients it's not very accurate, and for a high number you get overfitting problems, as you can see on the graph.
The function itself looks pretty simple as you can see. So what else can I try?
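One more option, sketched below with invented names and toy numbers: for a 1-D weighted sum of Gaussians, the global maximum in practice sits near one of the centres, so gradient ascent restarted from every centre tends to be both fast and reliable, unlike a single golden-section run:

```python
import numpy as np

def rbf_value(a, centers, weights, beta):
    # Value along the action axis: a weighted sum of Gaussians (assumed model).
    return float(np.sum(weights * np.exp(-beta * (a - centers) ** 2)))

def argmax_from_centers(centers, weights, beta, lr=0.05, n_steps=200):
    # Gradient ascent started from every centre; keep the best endpoint.
    best_a, best_v = None, -np.inf
    for a0 in centers:
        a = float(a0)
        for _ in range(n_steps):
            # d/da of sum_j w_j exp(-beta (a - c_j)^2)
            a += lr * np.sum(weights * np.exp(-beta * (a - centers) ** 2)
                             * (-2.0 * beta) * (a - centers))
        v = rbf_value(a, centers, weights, beta)
        if v > best_v:
            best_a, best_v = a, v
    return best_a, best_v

centers = np.array([-1.0, 0.0, 1.0])   # toy example
weights = np.array([0.5, 1.0, 0.3])
best_a, best_v = argmax_from_centers(centers, weights, beta=4.0)
```

Cost is one short ascent per centre, which should be far cheaper than PSO, and the multi-start handles the multiple local maxima that break golden-section search.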
Happy new year everyone.
In the field where I work (compartmental modelling of drugs), a group came up with a method to fit data that gained quite some popularity, which they called "radial basis function fitting". To me, instead, it doesn't make sense and works only by accident. I wanted the opinion of someone with a better understanding of the underlying theory.
Basically, we have a system whose response is y(t) = k1*x(t) + k2*conv(x(t), exp(-k3*t)), where t is time, y is the output value that we measure, x is the input (a function of time), and conv is the convolution operator. The three k's are the parameters we need to fit. Actually, what we really want to measure is B = k1 + k2/k3 - 1.
What we used to do was to take x(t), measure y(t), and fit the three k's by minimizing the chi-squared in a non-linear fit. Sadly, this fit was generally very noisy.
The proposed "radial basis function" method works by computing theta_i(t) = conv(x(t), exp(-k3_i*t)) for multiple k3 values, indexed by i, finely sampled over the whole range of values that k3 might assume in a physical situation. Then, for each i, they do a linear fit of k1_i and k2_i in y(t) = k1_i*x(t) + k2_i*theta_i(t) and save the chi-squared. Finally, they keep the triplet (k1, k2, k3) for the theta_i that gave the minimum chi-squared.
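For concreteness, here is a hedged reconstruction of that procedure as I read it (all numbers and names are invented for illustration): a grid over k3 turns the non-linear problem into a family of linear least-squares fits, one per candidate k3:

```python
import numpy as np

def grid_fit(t, x, y, k3_grid):
    # For each candidate k3: precompute theta = conv(x, exp(-k3 t)),
    # solve the now-linear fit y ~ k1*x + k2*theta, keep the best chi-squared.
    dt = t[1] - t[0]
    best = None
    for k3 in k3_grid:
        theta = np.convolve(x, np.exp(-k3 * t))[: len(t)] * dt
        A = np.column_stack([x, theta])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        chi2 = float(np.sum((A @ coef - y) ** 2))
        if best is None or chi2 < best[0]:
            best = (chi2, coef[0], coef[1], k3)
    return best  # (chi2, k1, k2, k3)

# synthetic check with made-up ground truth
t = np.linspace(0.0, 10.0, 200)
dt = t[1] - t[0]
x = 1.0 + 0.1 * t                       # slowly varying input, as in the post
k1, k2, k3 = 0.5, 2.0, 0.7              # invented true parameters
y = k1 * x + k2 * np.convolve(x, np.exp(-k3 * t))[: len(t)] * dt
chi2, k1_hat, k2_hat, k3_hat = grid_fit(t, x, y, np.linspace(0.1, 2.0, 39))
```

Note that this is really a 1-D grid search over the only non-linear parameter, with the linear part profiled out; whether it deserves the name "radial basis function fitting" is a separate question.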
Now, to me this is completely unnecessary. The only reason it gives improvements over the standard "usual" fit is that, since x(t) is approximately a slowly varying linear function, k2 and k3 are strongly correlated, and therefore our final parameter k1 + k2/k3 - 1 is almost insensitive to whether k2 and k3 are individually fitted correctly.
Any thoughts? Has anyone ever heard of this method applied in other settings?
I have a function, k(x,y), from AxA to R, where A is a compact subset of R. Suppose k(x,y) = k(y,x) for all x,y in A and that \int k(x,y)^2 dx dy is finite. The operator K defined by Ku = \int k(x,y) u(y) dy is therefore self-adjoint and Hilbert-Schmidt, so there is an orthonormal set of eigenfunctions, v_n, with real eigenvalues.
Is there necessarily a countable set of eigenfunctions? If not, what else do I need to assume to conclude this?
Also, given any function u in L2(A), can I write u(x)=sum_n <u,v_n > v_n where <,> is the L2 inner product? If not, what else do I need to assume to get this?
I guess I forgot all my functional analysis and can't quite find the right theorem through googling. Thanks for any help!
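Not a full answer, but the relevant result should be the spectral theorem for compact self-adjoint operators (stated here from memory, so worth verifying in a functional analysis text):

```latex
\text{Since } A \subset \mathbb{R} \text{ is compact, } L^2(A) \text{ is separable,}
\text{ so any orthonormal set is at most countable.}

\text{For compact self-adjoint } K \text{ (Hilbert--Schmidt} \Rightarrow \text{compact),}
\text{ there are eigenpairs } (\lambda_n, v_n),\ \lambda_n \to 0, \text{ with}

K u = \sum_n \lambda_n \langle u, v_n \rangle v_n
\quad \text{for all } u \in L^2(A).

\text{The } v_n \text{ form an orthonormal basis of }
\overline{\operatorname{ran} K} = (\ker K)^{\perp}. \text{ Hence}

u = \sum_n \langle u, v_n \rangle v_n \ \text{for every } u \in L^2(A)
\iff \ker K = \{0\};

\text{otherwise, append an orthonormal basis of } \ker K
\text{ (these are eigenfunctions with eigenvalue } 0\text{).}
```

So countability comes for free from separability, and the expansion for arbitrary u in L2(A) needs the extra assumption that K is injective (or you enlarge the eigenfunction set by a basis of the kernel).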
I have created a normalized radial basis function network with the hope of being able to forecast the following data.
My input (X) is a date/time, currently converted to a long, but only because I'm not sure how else to encode it; the output (Y) is the amount of coal used over that time frame.
I got this data from http://www.gridwatch.templar.co.uk/ and it's from the dates 2011-2015.
After training the network I obtain the weights, centers, and beta values, which can be used to evaluate the performance of the network; running the code I currently have produces this graph.
http://i.imgur.com/lEqPmW1.png
After running my training algorithm I would now like to forecast the following year, 2016; however, I have no idea (I'm a beginner) how to go about doing so.
Any advice would be appreciated.
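Mechanically, forecasting just means evaluating the trained network at the new inputs; a minimal sketch (the variable names are assumptions about your trained quantities, not your actual code) for a normalized RBF network with scalar input looks like this:

```python
import numpy as np

def rbf_predict(x, centers, weights, betas):
    # Normalized RBF network output at scalar input x:
    # Gaussian activations, normalized to sum to one, then weighted.
    phi = np.exp(-betas * (x - centers) ** 2)
    phi = phi / phi.sum()
    return float(np.dot(weights, phi))

# toy illustration with invented parameters
centers = np.array([0.0, 1.0, 2.0])
weights = np.array([1.0, 2.0, 3.0])
y_mid = rbf_predict(1.0, centers, weights, betas=1.0)
y_out = rbf_predict(3.5, centers, weights, betas=1.0)
```

The catch is that 2016 timestamps lie entirely outside your training range, and far from the data a normalized RBF net just flattens toward the output of the nearest centre (and eventually all activations underflow), so it cannot reproduce seasonality there. A common workaround is to feed the network periodic features (day-of-year, day-of-week, hour-of-day) instead of a raw long timestamp, so that 2016 inputs land inside the region the network was trained on.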
I'm just starting to look into Radial Basis Functions as a means to classify some data I've collected (joint angle and velocity). This is time-series data at the moment, meaning that at a specific sample I have a joint-angle and an instantaneous-velocity measurement.
Most examples of RBF I've found show an x-y plane with groupings of data, where the goal is to place data points into one of n groups.
I've not seen any examples of how to classify the sort of data that I've got, though.
I do know that this is basically a Gaussian Mixture Model, but that's as far as my searching has gotten me.
Any help would be appreciated (including letting me know if this isn't the right place to look).
I'm currently performing some research that has led me to implement a radial basis function network.
I've already tried simple k-means clustering for the centers, with static variance, and either pseudoinverse or backpropagation training of the output weights. This didn't work at all. I've since implemented training of the RBF nodes (centers and variances) as well. It works for small datasets, but not for larger, more complex datasets.
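For reference, the baseline pipeline described above can be sketched as follows (a hedged toy implementation; the width heuristic and all parameter choices are my assumptions, not your code). If even this variant fails on a dataset, the usual suspects are the width choice and centre placement rather than the weight solver:

```python
import numpy as np

def fit_rbf(X, y, n_centers=10, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    # 1) centres by plain Lloyd's k-means, initialized from the data
    centers = X[rng.choice(len(X), n_centers, replace=False)].copy()
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(n_centers):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    # 2) one shared width from the mean nearest-neighbour centre distance
    d = np.sqrt(((centers[:, None, :] - centers[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)
    sigma = d.min(axis=1).mean()
    # 3) output weights by pseudoinverse of the Gaussian design matrix
    Phi = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) / (2 * sigma ** 2))
    w = np.linalg.pinv(Phi) @ y
    return centers, sigma, w

# toy check on a smooth 1-D target
X = np.linspace(0.0, 2 * np.pi, 100)[:, None]
y = np.sin(X[:, 0])
centers, sigma, w = fit_rbf(X, y)
Phi = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) / (2 * sigma ** 2))
mse = float(np.mean((Phi @ w - y) ** 2))
```

Tying the width to the inter-centre spacing (rather than a static variance) matters: too narrow and the basis functions don't overlap, too wide and the design matrix becomes nearly rank-deficient, which may be what "didn't work at all" was showing.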
I've been googling for references, but most of the papers and websites I find are sorely lacking in detail. I've seen various mentions of using genetic algorithms to train the RBF nodes, but no details; same story for Self-Organizing Maps.
If anyone knows of quality papers or resources for radial basis function networks, they would be highly appreciated. Actual code is perfect for me; detailed explanations are the next best resource.
Hi guys, I wanted to learn more about drivers, so I started reading Linux Device Drivers.
Where is the reference for all the functions that the kernel provides, and for the structs in it? I mean functions like these:
Up till now I've just been googling and reading from random websites.
https://preview.redd.it/ifxnfxcef3b81.png?width=800&format=png&auto=webp&s=645d3c24f77379f8d40be943c6dcd731da0871db
I have an application and I'd like to measure its page cache hit/miss numbers. I found a tool called cachestat by Brendan Gregg, which uses ftrace to count the calls to 4 specific functions (mark_page_accessed(), mark_buffer_dirty(), add_to_page_cache_lru(), and account_page_dirtied()) to find out the total number of page cache accesses and page cache misses. Detailed information can be found here, but basically, the total numbers of page cache accesses and misses can be calculated as below:
>total number of accesses = number of mark_page_accessed() - number of mark_buffer_dirty()
total number of misses = number of add_to_page_cache_lru() - number of account_page_dirtied()
When running on my systems (tested on two, with kernel versions 3.10 and 4.18), the hit rate becomes negative, because for some reason the count of add_to_page_cache_lru() is larger than the count of mark_page_accessed(), while the counts of the other two functions are practically zero (0-10 calls every 5 seconds, compared to hundreds of thousands to millions for the first two).
As Brendan explains it in this thread, different kernels may use different kernel functions to access (or insert to) the page cache. I think that if I know what other functions are invoked upon page cache accesses/misses/hits, I could potentially modify the provided script to work on my kernel. Unfortunately, I have little to no knowledge about Linux kernels' functions, and would like your help on determining which functions are invoked upon page cache accesses.
Thanks!
P.S. It's my first time writing here, I hope it's OK to crosspost this question to StackOverflow and maybe r/kernel as well. Also, I have checked this post from 5 years ago which pretty much asks the same thing, just in case someone was wondering.
std::vector but parallel so each cuda thread has its own vector efficiently using vram.
std::map but in parallel so each cuda thread can map things independently and efficiently.
std::thread but as a syntactic sugar around dynamic parallelism kernels of CUDA
std::queue but has efficient communication between cuda threads
Even some emulation of filesystem commands, as if memory were a file, so that opening a file actually does an mmap and binds it through PCIe to CUDA unified memory.
Similar things like above.
My UnRaid setup began having kernel panics upon boot this week. I haven't updated the OS or changed any settings that I'm aware of.
The system DOES boot after being placed into safe mode with plugins disabled. Does this mean that a plugin is causing the issue? (Likely a plugin that auto-updated?) If so, what is the best way to manually test/disable single plugins while keeping the others intact?
Thanks for the help.