What is the difference between ECSE 420 Parallel Computing and COMP 409 Concurrent Programming?
Is there an order I should take them in? Should I take them in the same semester? Different semesters? Should I not take COMP 409 (it's an elective) since I have to do ECSE 420?
Hi All,
Is anyone doing a diploma in computing alongside their B-Commerce degree? How are you guys finding it? Has it been useful in terms of enhancing employability, and can you relate those skills back to B-Commerce?
Is it possible to finish both in 4 years whilst doing 3 subjects per semester?
If you are not doing B-Com, how are you finding the diploma?
Any comment is appreciated, thanks in advance! Hope everyone is almost finished with their exams and has a well-deserved break ahead.
I found these two books:
https://www.manning.com/books/parallel-and-high-performance-computing
https://www.manning.com/books/c-plus-plus-concurrency-in-action-second-edition
I've heard very good things about the 2nd book (C++ Concurrency in Action), but the first book (Parallel and High-Performance Computing) seems to cover more scope and some major libraries. Has anyone read the 1st book (or both), and have a recommendation or insight?
Thanks!
Just thought that it's an interesting topic and that I should know more about it. Something with tutorials or an applied aspect would be preferred :) . TIA
Hi everyone! I've been programming for a long time, but I've just recently been getting started in the area of high-performance computing/parallelism and whatnot. I haven't taken any classes on the subject, so I'm not sure if what I'm doing is really optimal. Does anyone have any good references for a "best practices" sort of thing with regards to concurrent programming? I'm also open to recommendations for textbooks!
Thanks!!
Hi, I am hopefully going on exchange to NUS in August next year. I am doing a mechatronics engineering major, and while at NUS for the semester I would like to do some computer science subjects. The computer vision and pattern recognition course sounds particularly interesting; however, I can see it has the Data Structures and Algorithms course as a prerequisite, and my university only offers that course in the same semester, so I can't take it beforehand.

For people who are in similar fields (ComSci, Software Engg, Electrical Engg, etc.): do you think it would be reasonable to take these courses concurrently, or is the Data Structures knowledge very foundational to the Computer Vision subject from the beginning? Does NUS enforce prerequisites?

And out of general interest, what other subjects would you recommend if you have a background in mechatronics or related areas like ComSci (these are usually my favourite subjects in the major)? I was thinking about also taking the intro to machine learning, but then I need at least one more and some backup options. Thanks for the suggestions!
Hi!
I'm looking to get a strong grip on the inner workings of CPUs, threading, concurrency, instructions, SIMD, etc.
Right now I have a shallow understanding of the matter. I'm looking for an intro book on the whole thing, nothing too technical, as I am starting to program in Rust and getting to grips with assembly language.
Basically, how does a program get executed at the physical level, and how does it all come together to create these fascinating machines? Thank you!
https://github.com/vishen/go-chromecast
Has anyone used this before? I've been looking for open hardware/source replacements for our Google Home devices (they're all already using fake accounts with no personalization and privacy turned off, but Google being Google, it's all still shared anyway).
go-chromecast looks to be a way to keep using some Google Home devices as dumb devices (with fake locations).
We are also using a few Chromecast devices on the TVs, isolated on their own VLAN (along with the Google Homes). This go-chromecast looks very promising, especially when mixed with some self-created voice-command tooling.
I didn't see it listed on the Awesome Selfhosted list, hence the post.
Alo sirs. I have been way overthinking this small project for reasons relating to PTSD after a ransomware incident. What's the right and safer way to go about this? Do I have to purchase RDS CALs? Only going to have 3-4 concurrent users at most. Running Win 10 Pro on the "host". Also have a DC. Is installing the RDS role on there necessary? I could use some good direction here, fellas, thanks a bunch.
Edit** For all those who called this a conspiracy theory and a witch hunt: look at the text in the middle of this picture.
https://preview.redd.it/fdwcozw3qgw71.png?width=3840&format=png&auto=webp&s=262656f215a269cfcd5e79ef3f00fe766a5aea6a
"The owner of this site has temporarily banned you." HSTS protocols are set up and configurable in Cloudflare's HSTS panel. You can throttle scale and even turn throttling off.
They are at the control panel. I have so much shit to say, but this post is longer than most care for. This is screwed up, gang.
If you want to see the epic emotional cancer that's going on, dig through r/kucoin; no one ever mentions gains.
Report them to Reddit! Help me save crypto noobs from being harvested like explosions for preproduction on a Michael Bay film.
Here is a link to part 2, where I responded to u/Johnny_KuCoin: https://www.reddit.com/r/CryptoCurrency/comments/qf4ka4/followup_on_kucoin_cloudflare_and_more/
***Edit ***
TLDR summary
The crux is they don't spend money on IT and make money in doing so.
Ask the exchange(s):
While they may say "we don't make money indirectly off insurance funds," they absolutely do.
It's your right as an investor to know the details of an insurance fund you are paying into.
Since everyone accepts that a lot of exchanges do this, other exchanges do it too. I literally have screenshots of conversations that say as much.
You are being throttled out. They can indeed scale up at a cost.
If for some reason they cannot, they have a fiduciary duty, the moment they take your funds, to tell you the risk of their inadequate IT architecture and settings. Moreover, they could just install a kill switch that ends trades without penalty if the web servers go down or they exceed bandwidth.
As cost-effective as it is to build in a kill switch as a solution, it's not profitable to exchanges that are having a liquidity crisis. Assets on exchanges are becoming more scarce (see EIP-1559 and many other factors).
If an exchange restricts your access they should still not be placing higher priority orders via the OTC desk while you are locked out. This should also be disclosed.
Cloudflare is the brand of e
I did a lot of video encoding to get some numbers that may be useful to some Plex server admins here. Enjoy, y'all.
^(Yes, I did format it as a research paper. No, I'm not sure why. No, I have no idea if that makes it better or worse.)
Video compression is a science of art. It's math that's viewed subjectively, ephemerally, and smeared 20 to 60 times per second. So it's no wonder that we argue all the time about settings without being able to quantify the way video makes us feel. I'm not going to present anything to change your mind.
TL;DR at the bottom. Read the whole thing anyways, it's a fantastic mad ramble.
So I got bored one day and wanted to know, "how much does transcoding a file in Plex hurt the quality?" Pretty simple question, right? How bad can it possibly be? So I grabbed a video in my library, encoded it, and watched it again. Didn't look too bad. But then I realized it was already compressed from a higher quality source, so maybe it was so low quality that I didn't notice how bad it was? So I encoded it again, same settings. And it still looked fine.
That's when I remembered, if my server transcodes it uses an Nvidia 1060 to encode. Maybe the GPU makes it look worse? I watched a few minutes of it, making sure the GPU was transcoding, and again, didn't notice a problem. So I did what any sane person would do - I grabbed a bunch of different files, set up a bunch of machines in my homelab, and started encoding like my life depended on it.
Thanks to some previous research, I know that there's some math out there to actually quantify the difference in quality between reference and compressed video. Peak Signal-to-Noise Ratio is the classic, and Structural Similarity Index Measure was made for exactly this. And on top of that, noted Internet Content Delivery Company Netflix developed VMAF for their entire library of content. So I used those three metrics to compare the 450 final encodes I created.
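For anyone curious what PSNR actually computes, here is a minimal sketch in Haskell of the standard formula (10 · log10(MAX² / MSE) for 8-bit samples, so MAX = 255). This is an illustration of the metric itself, not the tooling used for the encodes above; the frame representation as a flat list of `Word8` samples is an assumption for the example.

```haskell
import Data.Word (Word8)

-- PSNR in dB between two 8-bit frames of equal length,
-- each given as a flat list of grayscale samples (an
-- assumption for illustration; real tools work on planes).
psnr :: [Word8] -> [Word8] -> Double
psnr ref enc = 10 * logBase 10 (255 * 255 / mse)
  where
    diffs = zipWith (\a b -> fromIntegral a - fromIntegral b) ref enc :: [Double]
    -- Mean squared error over all samples.
    mse   = sum (map (^ 2) diffs) / fromIntegral (length diffs)

main :: IO ()
main = print (psnr [0, 0] [0, 255])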
You can find my encoding/calculation scripts, encoder presets, and ramblings at this github repo. In short, I selected 9 videos to serve as "sources" for comparison:
^(Note: these are numbered 0 to 8, but reddit's markdown start
Has anyone been a part of / is currently in the concurrent Master of Science program in computer engineering? Can you tell me more about it and how competitive it is to get into? I've already met with my advisor and the head of the concurrent ECPE program to discuss it, but if anyone has anything they can tell me about it, lmk!
Let me list some concepts of computing/programming:
Is there a single source of truth for all of the above? It may be a personal problem of mine in not understanding them correctly, but I get the feeling every language treats them differently in different situations, and they are treated differently across languages too. We as a community treat them differently, and seem to misunderstand them on a constant basis. For instance, I constantly hear people correcting each other with "parallelism is not concurrency" or similar stuff. Do we actually know these concepts, as an industry/field?
Here are other concepts that are architecture-specific that we tend to use in programming languages:
Do the architecture-specific concepts relate to or define the computational ones? Is "concurrency" 100% tied to concepts like "thread" or "shared memory between threads"? Are constructs like "yield", "await", etc., tied 100% to the operating system? Don't languages like Haskell and the like make them independent of the architecture and just a part of the language? What exactly is part of the language and what is "external"?
Here are other concepts about computation in general:
These latter concepts about computation do have a single source of truth, specifically in math. The Turing machine is the same for everyone, and every language can be translated into one. The concept of computation is the same for everybody. Is there something similar for all the other concepts? Something we can point to as the "universal" definition to be used in any programming language to be designed? Something that we can use as a common language and common model when discussing concurrency via locks in C, or via actors in Erlang, or via coroutines in Go? Something that works just as well when applied to LISP as it would when applied to Assembly?
I was wondering if anyone had any insight on this. Operating Systems (CS537) is a prereq for Introduction to Computer and Information Security (CS642). I can see from old slides that one of the first chapters is OS security.
I would have to get permission to join, but I don't know if it's worth it. On one hand, taking them both seems like a good thing, as I'd be forced to become familiar with some OS concepts to catch up in 642. On the other hand, I don't want to be totally lost.
If it helps, 537 is being taught by Prof. Venkataraman this semester and I've heard he's a good professor. 642 is being taught by Prof. Earlence Fernandes, who is teaching 642 for the first time.
A.NET (A#/A sharp) A-0 System A+ (A plus) ABAP ABC ABC ALGOL ACC Accent (Rational Synergy) Ace DASL (Distributed Application Specification Language) Action! ActionScript Actor Ada Adenine (Haystack) AdvPL Agda Agilent VEE (Keysight VEE) Agora AIMMS Aldor Alef ALF ALGOL 58 ALGOL 60 ALGOL 68 ALGOL W Alice (Alice ML) Alma-0 AmbientTalk Amiga E AMOS (AMOS BASIC) AMPL Analitik AngelScript Apache Pig latin Apex (Salesforce.com, Inc) APL App Inventor for Android's visual block language (MIT App Inventor) AppleScript APT Arc ARexx Argus Assembly language (ASM) AutoIt AutoLISP / Visual LISP Averest AWK Axum B B Babbage Ballerina Bash BASIC Batch file (Windows/MS-DOS) bc (basic calculator) BCPL BeanShell Bertrand BETA BLISS Blockly BlooP Boo Boomerang Bosque Brainfuck Bucket C C β ISO/IEC 9899 C-- (C minus minus) C++ (C plus plus) β ISO/IEC 14882 C* C# (C sharp) β ISO/IEC 23270 C/AL CachΓ© ObjectScript C Shell (csh) Caml Cayenne (Lennart Augustsson) CDuce Cecil CESIL (Computer Education in Schools Instruction Language) CΓ©u Ceylon CFEngine Cg (High-Level Shader/Shading Language [HLSL]) Ch Chapel (Cascade High Productivity Language) Charm CHILL CHIP-8 ChucK Cilk (also Cilk++ and Cilk plus) Control Language Claire Clarion Clean Clipper CLIPS CLIST Clojure CLU CMS-2 COBOL β ISO/IEC 1989 CobolScript β COBOL Scripting language Cobra CoffeeScript ColdFusion COMAL COMIT Common Intermediate Language (CIL) Common Lisp (also known as CL) COMPASS Component Pascal Constraint Handling Rules (CHR) COMTRAN Cool Coq Coral 66 CorVision COWSEL CPL Cryptol Crystal Csound Cuneiform Curl Curry Cybil Cyclone Cypher Query Language Cython CEEMAC D D Dart Darwin DataFlex Datalog DATATRIEVE dBase dc DCL (DIGITAL Command Language) Delphi DinkC DIBOL Dog Draco DRAKON Dylan DYNAMO DAX (Data Analysis Expressions) E E Ease Easy PL/I EASYTRIEVE PLUS eC ECMAScript Edinburgh IMP EGL Eiffel ELAN Elixir Elm Emacs Lisp Emerald Epigram EPL (Easy Programming Language) EPL (Eltron Programming Language) Erlang es 
Escher ESPOL Esterel Etoys Euclid Euler Euphoria EusLisp Robot Programming Language CMS EXEC (EXEC) EXEC 2 Executable UML Ezhil F F F# (F sharp) F* Factor Fantom FAUST FFP fish FjΓΆlnir FL FlagShip Flavors Flex Flix FlooP FLOW-MATIC (B0) FOCAL (Formulating On-Line Calculations in Algebraic Language/FOrmula CALculator) FOCUS FOIL FORMAC (FORMula MAnipulation Compiler) @Formula Forth Fortran β ISO/IEC 1539 Fortress FP FoxBase/FoxPro Franz Lisp Futhark F-Script G Game Maker Language (Scripting language)
(I personally think we should transition to a 6-hour workday, but that's just me)
Hello r/haskell!
I'm working on a programming assignment where I'm asked to speed up a BFS-type algorithm (an N-puzzle solver, to be specific) with parallelism. I've tried using parMap to evaluate the potential next moves and their costs in parallel, but this has only made my program slower. Is there something wrong with my approach?
I'd also like to run the actual path-finding recursion with all candidates (or the best two) at the same time, but I've heard this isn't really possible because the function returns an IO (). Is there anything I can do to make this part of the program go faster?
Ideally I want something like this:
for each state in possible_future_states:
new thread(target=BFS, args=variables updated with this state being chosen)
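The loop above can be sketched with nothing but base's Control.Concurrent, forking one green thread per candidate state and collecting results through MVars. This is only a shape for the idea, not a tuned solution (for pure cost evaluation, Strategies/parMap with a proper evaluation depth is usually the better fit); `cost` here is a hypothetical stand-in for the real puzzle heuristic.

```haskell
import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar)
import Data.List (minimumBy)
import Data.Ord (comparing)

-- Hypothetical stand-in for the real heuristic cost of a puzzle state.
cost :: [Int] -> Int
cost = sum

-- Fork one thread per candidate state (the "new thread" idea from the
-- pseudocode) and collect each (cost, state) pair through an MVar.
exploreAll :: [[Int]] -> IO [(Int, [Int])]
exploreAll states = do
  boxes <- mapM spawn states
  mapM takeMVar boxes          -- blocks until every thread has answered
  where
    spawn s = do
      box <- newEmptyMVar
      _ <- forkIO (putMVar box (cost s, s))
      return box

main :: IO ()
main = do
  results <- exploreAll [[3, 1, 2], [2, 0, 1], [1, 1, 1]]
  -- Pick the cheapest candidate to expand next.
  print (minimumBy (comparing fst) results)
```

Note that a thread per state only pays off when each `cost` is expensive; for cheap heuristics the forking overhead is exactly why naive parallelism can come out slower, which may be what you're seeing with parMap.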
Thank you for reading this. Any advice is greatly appreciated.
EDIT:
This is my current attempt