A list of puns related to "Isotonic regression"
I was hoping to get some feedback on my first attempt at a Rust library, an implementation of one of my favorite algorithms. It's called "pair adjacent violators" and it's useful in many machine learning applications.
I'm the author of a similar library in Kotlin, although this is a from-scratch implementation where I'm trying to stick to Rust idioms, while also removing some unnecessary complexity from the Kotlin version (specifically the spline implementation - which, while fun, was overkill).
Since I'm new to Rust I'm particularly interested in ways I might have deviated from Rust idioms, but all constructive feedback is appreciated.
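For readers unfamiliar with the algorithm, here is a minimal pure-Python sketch of pool-adjacent-violators. This illustrates the algorithm itself, not the library's API; the function name and signature are made up for illustration.

```python
def pav(y, weights=None):
    """Pool Adjacent Violators: return the non-decreasing sequence
    that best fits y in the weighted least-squares sense."""
    if weights is None:
        weights = [1.0] * len(y)
    # Each block holds [weighted mean, total weight, count of pooled points].
    blocks = []
    for value, w in zip(y, weights):
        blocks.append([value, w, 1])
        # Pool adjacent blocks while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            total = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / total, total, n1 + n2])
    # Expand each pooled block back to its constituent points.
    fitted = []
    for mean, _, count in blocks:
        fitted.extend([mean] * count)
    return fitted

print(pav([1.0, 3.0, 2.0, 4.0]))  # [1.0, 2.5, 2.5, 4.0]
```

Each violating pair is merged into a weighted average, so the whole fit runs in linear time over the input.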
Introducing Torchsort, an implementation of "Fast Differentiable Sorting and Ranking" (Blondel et al.) in PyTorch, complete with a custom C++ and CUDA kernel for fast performance.
pip install torchsort
https://github.com/teddykoker/torchsort
Differentiable sorting and ranking operations open the door to new loss functions. For example, you can easily implement Spearman's rank correlation coefficient using Torchsort and have a model learn to output predictions with a monotonic relationship to the targets:
import torch
import torchsort
def spearmanr(pred, target, **kw):
    pred = torchsort.soft_rank(pred, **kw)
    target = torchsort.soft_rank(target, **kw)
    pred = pred - pred.mean()
    pred = pred / pred.norm()
    target = target - target.mean()
    target = target / target.norm()
    return (pred * target).sum()
pred = torch.tensor([[1., 2., 3., 4., 5.]], requires_grad=True)
target = torch.tensor([[5., 6., 7., 8., 7.]])
spearman = spearmanr(pred, target)
# tensor(0.8321)
torch.autograd.grad(spearman, pred)
# (tensor([[-5.5470e-02, 2.9802e-09, 5.5470e-02, 1.1094e-01, -1.1094e-01]]),)
The algorithm itself is O(n log n), and runs quite fast on CPU and GPU (even with large batch sizes and sequence lengths) thanks to the custom isotonic regression kernel. I hope this is a helpful tool for the ML community!
Edit: Can a knowledge-distilled model produce **well-calibrated** probabilities?
Knowledge distillation (https://www.cs.toronto.edu/~hinton/absps/distillation.pdf) uses a modified softmax with a temperature hyperparameter to produce soft labels; a small student model is then trained on both the soft labels and the ground-truth labels.
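To make the temperature's effect concrete, here is a small sketch of the temperature-scaled softmax; the logits are made-up toy values:

```python
import math

def softmax(logits, temperature=1.0):
    # Dividing logits by T > 1 softens (flattens) the distribution;
    # T = 1 recovers the ordinary softmax.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 1.0, 0.0]
print(softmax(logits, temperature=1.0))  # peaked distribution
print(softmax(logits, temperature=4.0))  # much flatter distribution
```

At higher temperatures the largest probability shrinks toward uniform, which is exactly why raw distilled outputs may not be usable as calibrated probabilities.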
My question is that the introduction of the temperature seems likely to drastically change the distribution of the output probabilities. So for tasks that need accurate probabilities (for example, CTR prediction in recommender systems), rather than just classification, might distillation be a poor source of good probabilities?
What are some good ways to improve the accuracy of these probabilities? I am thinking about adding an isotonic regression step to calibrate them (https://www.cs.cornell.edu/~alexn/papers/calibration.icml05.crc.rev3.pdf).
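Isotonic calibration of the kind described in that paper can be sketched with scikit-learn's IsotonicRegression (assuming scikit-learn is available; the scores and labels below are toy data):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Uncalibrated model scores and binary click labels (toy data).
scores = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
clicks = np.array([0,   0,   1,   0,   1,   1,   0,   1,   1  ])

# Fit a monotone map from raw scores to empirical click rates;
# out_of_bounds="clip" handles unseen scores outside the training range.
calibrator = IsotonicRegression(out_of_bounds="clip")
calibrator.fit(scores, clicks)

# Calibrated probabilities are a non-decreasing function of the raw scores.
calibrated = calibrator.predict(scores)
print(calibrated)
```

In practice the calibrator would be fit on a held-out validation set so it corrects, rather than memorizes, the model's scores.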
The funeral director was asking us what we think Mum should wear in her casket.
Mum always loved to wear sarongs (fabric wraps that go around the torso and drape downward a bit like a long skirt would), so my uncle suggested that she wear a sarong in there.
The funeral director looked a bit confused, as did some of our family members, to which my uncle added:
"What's sarong with that?"
I started laughing like an idiot. He was proud of it too. The funeral director was rather shocked. We assured her, and our more proper relatives, that Mum would've absolutely loved the joke (which is very true).
His delivery was perfect. I'll never forget the risk he took. We sometimes recall the moment as a way to help cushion the blows of the grieving process.
--Edit-- I appreciate the condolences. I'm doing well and the worst is behind me and my family. But thanks :)
--Edit-- Massive thanks for all the awards and kind words. And the puns! Love 'em.
I would have a daughter
But Bill kept the Windows
True story; it even happened last night. My 5-year-old son walks up behind me and out of the blue says, "hey."
I turn to him and say, "yeah, kiddo? What's up?"
He responds, "it's dead grass."
I'm really confused and trying to figure out what's wrong and what he wants from me. "What? There's dead grass? What's wrong with that?"
.
.
.
He says, totally straight-faced, "hay is dead grass," and runs off.
You officially hit rock bottom
No it doesn't.
And then you will all be sorry.
Now it's syncing.
He replied, "Well, stop going to those places then!"
I will find you. You have my Word.
She said how do you know he was headed to work?
"thank you for your cervix."
...sails are going through the roof.
Made me smile
Mods said I'm a cereal reposter...
A taxi
But now I stand corrected.
Wait. Sorry, wrong sub.
Wookie mistake.
Theoretical Fizz-ics
Because you can't "C" in the dark
I said, βThat makes two of us.β
so I had to ground him.
He's doing better currently.
And conducting himself properly.