A list of puns related to "Josephson Voltage Standard"
In a standard five-year-old US house, I am planning to have a single-phase 240 V / 30 A line installed by an electrician to run multiple rigs on one circuit.
I understand I need a PDU, and I am searching the APC lineup. Over 150 of their models list 200, 208, or 230 V input. Would any of these work in my case? Only 4 models are listed at 240 V input; why so few? Am I missing something? Should I ask the electrician for a different type of circuit?
Also, is there a preferred plug/receptacle type for miners? I saw NEMA L6-30P as a possible option.
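For rough planning, the circuit's usable power is easy to estimate. The sketch below assumes the common NEC practice of loading a breaker to 80% of its rating for continuous loads (miners run 24/7); the 1400 W rig figure is just an illustrative number, not from the post.

```python
# Rough capacity estimate for a single 240 V / 30 A branch circuit.
# Assumes the common NEC 80% derating for continuous loads; the rig
# wattage below is illustrative only.

def continuous_capacity_watts(volts: float, breaker_amps: float,
                              derate: float = 0.8) -> float:
    """Usable continuous power on one branch circuit."""
    return volts * breaker_amps * derate

def max_rigs(rig_watts: float, volts: float = 240.0,
             breaker_amps: float = 30.0) -> int:
    """How many rigs of a given draw fit on the circuit."""
    return int(continuous_capacity_watts(volts, breaker_amps) // rig_watts)

print(continuous_capacity_watts(240, 30))  # 5760.0 W continuous
print(max_rigs(1400))                      # 4 rigs at ~1400 W each
```

At 240 V the same 80%-derated circuit carries 24 A continuous, which is why L6-30 (30 A rated) hardware is a common pairing for this kind of load.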
Thanks in advance!
After a successful first off-grid setup (a small mountain refuge), my next project is to take our tiny house off-grid.
I've estimated our anticipated system requirements:
PV/Batteries: As per my previous post, I have a large stock of 37 V 12 Ah Li-ion e-scooter batteries with a built-in BMS (https://www.youtube.com/watch?v=f6kZ2J3LulM)
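For sizing the bank from those packs, a back-of-envelope sketch. Assumptions flagged: that the "12 A" figure is the pack's amp-hour capacity (12 Ah), that 80% of it is usable depth of discharge, and the 3 kWh/day load and 2-day autonomy are made-up example numbers.

```python
import math

# Back-of-envelope bank sizing for the 37 V e-scooter packs.
# Assumes the "12 A" figure means 12 Ah capacity, and an 80% usable
# depth of discharge to be gentle on the cells. Load figures are
# illustrative only.

def pack_energy_wh(nominal_v: float = 37.0, capacity_ah: float = 12.0) -> float:
    """Nominal energy of one pack."""
    return nominal_v * capacity_ah  # 444 Wh per pack

def packs_needed(daily_load_wh: float, autonomy_days: float = 1.0,
                 usable_dod: float = 0.8) -> int:
    """Packs required to cover the load for the given days of autonomy."""
    usable_per_pack = pack_energy_wh() * usable_dod
    return math.ceil(daily_load_wh * autonomy_days / usable_per_pack)

print(pack_energy_wh())       # 444.0 Wh per pack
print(packs_needed(3000, 2))  # 3 kWh/day, 2 days autonomy -> 17 packs
```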
I'm bumping into two main problem areas outside my experience, namely:
There seem to be multiple approaches to this type of system:
Questions are:
Thanks kindly in advance for advice and expertise.
FWIW, if anyone wonders why I haven't simply spoken to a Victron / PV reseller: I have. Most don't want to deal with the 'strange' battery requirement, and the others aren't local, so I'd feel bad taking their advice and then buying from another reseller.
On Evike it says "9.6v Small Stick Type or 7.4v stick LiPo recommended", but I've seen some people use 11 V.
Does anyone know whether the ID.4's standard charger supports 240 V like the one on the Bolt? Thanks!
1.5, 3.3, 5, 9, 12: where do these numbers come from? Were they just picked by someone at some point, and it stuck? Do they work better for some reason? Are they easier to use? I get that most of them are nice and round, but 3.3? Why?
Sorry if this is a little outside the sub's rules; it's random, but I'm curious and I think this is the best place to ask.
Hello everyone, I had a few concerns about a technique in use at my workplace that doesn't seem very reliable to me. Hoping others may have alternative suggestions.
One of our in-development IoT devices, using an nRF52840, has the requirement to report the battery SoC, generate an alert when the battery falls below a certain percentage, and generate an alert when the battery voltage falls below a certain level. We have an analog input that is able to measure the battery voltage, but no sensors able to do any sort of coulomb counting.
Sampling from the ADC continuously was a non-starter for our power budget, as we need to disable most peripherals besides GPIO & RTC in the application's idle state. The battery chemistry escapes me; I'll edit the post with this information later if I can find the documentation again. But I can say for sure it is not a rechargeable battery, so we don't need to worry about charge cycles or anything like that.
The technique my colleagues came up with to calculate battery percentage is as follows:
- We captured a power characteristic for every action and state in the system (and I'm sure there are some conditions we haven't considered) to populate a lookup table of average current consumption, per second or per one occurrence of an action.
- The current-consumption lookup table plus state durations / action counts are used to accumulate the total µA-seconds consumed so far (to be converted to mAh and then to a percentage).
- We infrequently poll the battery voltage to check for a low-voltage condition (I personally think we should leverage the LPCOMP peripheral to get an interrupt for this instead, but polling is the current implementation).
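The accumulation scheme above can be sketched like this. To be clear, this is a simulation sketch, not our firmware: the state/action names, current values, and battery capacity are all made up for illustration.

```python
# Sketch of the lookup-table "software coulomb counting" scheme above.
# Event names, currents, and battery capacity are illustrative only.

# Average current per state (uA, accrued per second spent in the state)
# and charge per one-shot action (uA-seconds per occurrence).
STATE_CURRENT_UA = {"idle": 5.0, "advertising": 300.0, "connected": 1200.0}
ACTION_CHARGE_UAS = {"sensor_read": 2500.0, "radio_tx": 8000.0}

BATTERY_CAPACITY_MAH = 1000.0  # assumed primary-cell capacity
BATTERY_CAPACITY_UAS = BATTERY_CAPACITY_MAH * 1000.0 * 3600.0  # in uA-s

class ChargeCounter:
    def __init__(self) -> None:
        self.consumed_uas = 0.0

    def add_state(self, state: str, seconds: float) -> None:
        """Accrue charge for time spent in a steady state."""
        self.consumed_uas += STATE_CURRENT_UA[state] * seconds

    def add_action(self, action: str, count: int = 1) -> None:
        """Accrue charge for one-shot actions."""
        self.consumed_uas += ACTION_CHARGE_UAS[action] * count

    def soc_percent(self) -> float:
        remaining = BATTERY_CAPACITY_UAS - self.consumed_uas
        return max(0.0, 100.0 * remaining / BATTERY_CAPACITY_UAS)

cc = ChargeCounter()
cc.add_state("idle", 3600)     # one hour idling
cc.add_action("radio_tx", 10)  # ten transmissions
print(round(cc.soc_percent(), 4))
```

One property worth noting: the error in this scheme is purely additive, so any per-event bias in the lookup table grows without bound over the battery's life, which is exactly the drift concern raised below.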
Relying on this lookup table gives me the feeling that we are destined to accumulate an unacceptable margin of error. Are there any standard techniques to calculate battery SoC with voltage measurements alone? I have some experience implementing similar functionality with rechargeable batteries using PMICs like the MAX17260 and ADP5360, but of course those make it all too easy! Any and all suggestions or links to resources on the subject are appreciated.
EDIT: had to fix bulleted list formatting, oops!
I'm trying to understand the standards behind high-speed counters (HSCs), on both the PLC and sensor sides.
In commissioning, I only use the combinations my supplier recommends, so I have no problems. But I want to understand exactly how the compatibility works.
Talking about sensors, let's take encoders as an example. From what I have seen, most of them come with either a variable 5–24 VDC supply or a 24 VDC-only supply.
The 5–24 V supply encoders tend to have an output signal voltage matching whatever you power them with. I mean, of course they convey information through the frequency or count of the pulses, but is the voltage of these pulses important?
The 24 VDC-only encoders have (to my knowledge) a 24 VDC output.
The receivers tend to ask for inputs of 5 VDC or 24 VDC. So normally the manufacturers add a resistor to adapt the signal, in case you have a 24 VDC output on the sensor but the receiver asks for 5 VDC.
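That series-resistor trick is essentially a voltage divider against the receiver's input impedance. A quick sketch of the arithmetic; the 10 kΩ input impedance is a made-up example value, not from any datasheet.

```python
# The series resistor forms a voltage divider with the receiver's input
# impedance: a 24 V pulse dropped across R1 leaves ~5 V across R2.
# The 10 kOhm input impedance below is illustrative, not a datasheet value.

def divider_out(v_in: float, r1: float, r2: float) -> float:
    """Output voltage of a simple two-resistor divider."""
    return v_in * r2 / (r1 + r2)

def r1_for_target(v_in: float, v_out: float, r2: float) -> float:
    """Series resistance needed to drop v_in to v_out across a known r2."""
    return r2 * (v_in - v_out) / v_out

# Drop a 24 V pulse to 5 V into a 10 kOhm input:
r1 = r1_for_target(24.0, 5.0, 10_000.0)
print(round(r1))                                  # 38000 ohms
print(round(divider_out(24.0, r1, 10_000.0), 2))  # 5.0 V
```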
So, I'm having trouble finding information about what voltage this has to be for a lot of hardware, particularly the HSCs that Siemens has on the S7-1200 and S7-1500 PLCs.
Maybe they are compatible with both signal voltages? Is that even possible?
Any insight or correction to these questions is welcome. Thanks!
The title pretty much says it all. I have asked a few people at my work and no one seems to know why 24V is the industry standard for manufacturing control systems. Just curious.