A list of puns related to "Acceptance Testing"
Hi all,
I have been asked to draft a factory acceptance test document, along with a testing procedure, for some PLC-controlled automated equipment. Since I am familiar with the equipment (I designed the system), I have a thorough understanding of what to check before the equipment leaves the facility. However, I have no experience with the documentation or how these things are typically formatted. I have found some construction equipment F.A.T. templates online, but those seem to be more of the formal sign-off sheets: who was in attendance, attach the procedure and relevant documents, etc. I think that should be a good start for the formal sign-off papers, but I wouldn't mind seeing what others here might be able to share.
The thing I am most interested in seeing, though, is the format for factory testing procedures and what those documents might look like. I'm curious about the level of detail and whether any calculations are included. I'm starting from nothing and trying to gauge what is typical.
Thank you!
Hi everybody - we have a new project where we need to support development scalability (probably 10 teams working on the project). We have seen, worked on, and designed some "microservice" architectures in different projects. Sadly, none of those projects were truly microservice-architected, mostly because one or more properties of the microservice architecture were not followed (I think this https://www.youtube.com/watch?v=zzMLg3Ys5vI describes correctly which properties make a service a real microservice). The only project that came to fruition quickly and with comparatively little pain was the one where we decided to go for a distributed monolith from the beginning. All the other projects were flawed because they relied on integrating all the services and testing that integrated state before going to "production", plus other problems such as shared libraries and the like... The distributed-monolith project has one limiting factor, though: how many teams can work on such a monolith, especially when following Scrum, with all the developers merging changes before the sprint end (yes, I know, the user stories probably weren't good enough and so on, but that's the reality in all the projects I have seen). So we need a true microservice architecture in order to scale up to ten teams.
So my question is: what could be used to guarantee that the services can talk to each other correctly, and to mock the other microservices for acceptance testing? We have worked with consumer-driven contracts (CDC) in two projects (Pact and Spring Cloud Contract), and to me these are really just coupling mechanisms (they create a lockstep situation because of shared state), which in fact hinders the autonomy of the microservices - in the end they were either neglected or just faked (the rule was that all services should have CDCs). So we have looked at how this is handled by AWS, PayPal, and other providers with public APIs: there is a well-defined API, but most of them also hand SDKs to the consumers, which take care of things such as versioning, validation, and mapping. So our idea is that every microservice which exposes an API should create an SDK, publish it centrally, and also test it (all major and minor versions of it) in its acceptance tests, and the consumers should just use these SDKs to communicate with the corresponding service. In this way we could b
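For what it's worth, here is a minimal sketch of that idea in Python (all service, endpoint, and field names are hypothetical, and the requests library is assumed): the service team publishes a thin, versioned client that wraps its own HTTP API, and consumers depend on that client instead of hand-rolling calls.

# Illustrative only: a thin, versioned client SDK published by a
# hypothetical "orders" service; consumers install this package instead
# of talking to the raw HTTP API themselves.
from dataclasses import dataclass

import requests  # assumed HTTP client; any equivalent works

SDK_VERSION = "2.1.0"  # released and tested alongside the service itself


@dataclass
class Order:
    order_id: str
    status: str


class OrdersClient:
    """Client maintained and acceptance-tested by the orders team."""

    def __init__(self, base_url: str, timeout: float = 5.0):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout

    def get_order(self, order_id: str) -> Order:
        # The SDK owns URL construction, versioning, and mapping, so a
        # change in the wire format only has to be handled here.
        resp = requests.get(
            f"{self.base_url}/v2/orders/{order_id}", timeout=self.timeout
        )
        resp.raise_for_status()
        body = resp.json()
        return Order(order_id=body["id"], status=body["status"])

The service's own acceptance tests would then exercise every published major and minor version of this client against the running service, which is the property described above.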
What is User Acceptance Testing (UAT)?
It is a testing method in which end-users are responsible for testing the software application in a real-world environment. The testing and development teams need to provide all the necessary resources and support to the testers (the end-users) during the UAT process.
After the testing has been completed successfully, the users provide valuable feedback on the application's functionality, performance, usability, and user experience (UI/UX). Based on that feedback, the teams concerned can analyse the results and make strategic decisions to further improve the quality of the product.
The following are the eight key prerequisites that need to be met before initiating the UAT process:
1. Business requirements must be in place
2. Make sure that unit testing, integration testing, and system testing have been successfully completed
3. The application code needs to be fully developed
4. Regression testing must be performed in its entirety, with no major defects remaining
5. All reported defects need to be fixed and retested
6. The traceability matrix must be complete for all testing
7. Ensure that the UAT environment is fully set up
8. Obtain sign-off and the required communication from the system testing team confirming that UAT can be initiated
The following are some of the key points to consider while performing user acceptance testing:
1. The teams concerned must ensure that the application does not crash under any circumstances
2. All functions must accept valid inputs and produce correct outputs
3. The application should consume the minimum amount of resources
4. The application's load time needs to be analysed.
A UAT test case plays a pivotal role in the testing process. It includes the expected results, execution conditions, and a set of test steps developed to meet a particular objective. The objective can be verifying compliance with a particular requirement or exercising a particular program path. An example is sketched below.
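As a rough illustration (the identifiers and field names are made up for this example), a single UAT test case might be recorded like this:

# Illustrative only: one UAT test case captured as plain data.
uat_test_case = {
    "id": "UAT-017",  # hypothetical identifier
    "objective": "Verify invoice export complies with requirement REQ-42",
    "execution_conditions": [
        "Test user has the 'billing' role",
        "At least one paid invoice exists in the UAT environment",
    ],
    "steps": [
        "Log in as the test user",
        "Open Billing > Invoices and select a paid invoice",
        "Choose 'Export as PDF'",
    ],
    "expected_result": "A PDF matching the invoice data is downloaded",
    "actual_result": None,   # filled in by the end-user during execution
    "status": "Not run",     # Pass / Fail / Blocked once executed
}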
There is a UAT coming up and my client does not have any defect management tool. We have to track each tester's progress in Google Sheets and consolidate all errors. We also need to assign retests where required.
How can we effectively track everything in Google Sheets for 30 users?
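In case it helps, here is a minimal sketch of how the consolidation step could be automated, assuming the Python gspread library, a service account with access to the shared spreadsheet, and hypothetical spreadsheet, tab, and column names:

# Illustrative only: roll up open defects from one tab per tester into a
# single "Consolidated" tab for triage and retest assignment.
import gspread

gc = gspread.service_account()            # reads credentials from the default path
book = gc.open("UAT Defect Tracker")      # hypothetical spreadsheet name
master = book.worksheet("Consolidated")   # hypothetical master tab

for tester_tab in book.worksheets():
    if tester_tab.title in ("Consolidated", "Instructions"):
        continue                          # skip non-tester tabs
    for row in tester_tab.get_all_records():
        # Assumes each tester tab has Defect ID / Module / Severity / Status
        # columns in its header row; only open defects are copied across.
        if row.get("Status", "").lower() != "closed":
            master.append_row([
                tester_tab.title,          # who reported it
                row.get("Defect ID", ""),
                row.get("Module", ""),
                row.get("Severity", ""),
                row.get("Status", ""),
            ])

One tab per tester plus one consolidated tab keeps individual progress visible, while a script like this (or an IMPORTRANGE/QUERY formula) handles the roll-up for retest assignment.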
I am so confused about small liberal arts colleges. Claremont, for example, has an average ACT score of 32 but an acceptance rate of 10%. Colby has an acceptance rate in the single digits but an ACT range of 31-34. Meanwhile, schools like UPenn, Columbia, Stanford, and so on have similar acceptance rates but ACT scores ranging from 33-35.
Are they not looking at standardized testing scores that much at all? If they aren't, are they looking more at GPA, essays, or ECs? idk, I just feel that their admissions criteria might be wildly different from larger (probably equally prestigious) colleges.
Note: I'm not trying to shit on small colleges lol. I'm just confused about that discrepancy in test scores and acceptance.
Hi Redditors, normally I don't post newly released podcast episodes here because of the rules, but I believe this episode is highly interesting for most of you.
I interviewed Kevin Peres from EXFO, and if there is one person who has seen and experienced everything in fiber optics, it's certainly Kevin. From personal experience, I can tell you that he has helped countless people deliver quality test results and get paid on time. He is a true genius when it comes to report customization and also has a deep understanding of higher-layer transmission, which shows in his day-to-day work. We discussed the most common problems inside an optical fiber network. Following this introduction, we went deep into fiber fault-finding, covering the best practices but also including some ninja tactics.
You can find the episode in all podcast directories (Become Fiberminded) or here
Send me a PM to say Hi and let me know your thoughts on it
I have been doing user acceptance testing for about 2 years now. The job is great, but every now and then I think about the future, and I'm pretty sure I don't want to keep doing this. What roles can I jump into with this testing experience? I've heard a lot about automation, but I have no clue how to get started (I do know how to code, but I'm not that great at it). Can anyone please help me out?
Thank you in advance.
Liquid penetrant testing, often simply called PT or dye penetrant testing, is a surface-defect detection NDT method and is sensitive to all kinds of surface-breaking defects, such as cracks, porosity, hot tears, laps, etc.
NDT methods are taught by NDT training institutes through their NDT courses, and liquid penetrant testing is one of the basic non-destructive testing methods.
ASME Boiler and Pressure Vessel Code (BPVC) Section V - Article 6 and Article 24
ASTM E-1417 (ASTM Standard)
ASTM E-165 (penetrant testing standard for general industry)
ISO 3452-1 (ISO Standard)
SAE AMS 2644 (Aerospace Standard)
AS 2062 (Australian Standard)
DIN EN 571
Indications are evaluated against the client's acceptance criteria or the applicable product standard, as agreed with the client.
AWS D1.1 - Structural Steel Welding Code
API 1104 - Pipeline Welding
ASME BPVC Section VIII - Pressure Vessels
ASME B31.1 - Power Piping
ASME B31.3 - Process Piping
ASME B16.34 - Valves, and many more
Acceptance criteria for liquid penetrant testing (LPT) from the above-mentioned standards are given below for your convenience.
Indications > 1/16" (1.59 mm) shall be considered relevant indications. Relevant indications are rejectable if:
Evaluated as crater cracks or star cracks and > 5/32" (3.96 mm) in length.
Evaluated as cracks other than crater cracks or star cracks.
Evaluated as IF (incomplete fusion) and > 1" (25.4 mm) in total length in a continuous 12" (304.8 mm) length of weld, or > 8% of the weld length.
For evaluation, the maximum dimension of a rounded indication shall be considered as its size. Indications are rejectable as defects if:
Individual indications > 1/8" (3.17 mm)
The distribution of scattered rounded indications exceeds the concentration limits of API 1104 Fig. 18 or 19.
Clustered rounded indications > 1/2" (12.7 mm)
Combined length of clustered indications > 1/2" in any 1
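Purely as an illustration, the numeric limits above (leaving the chart-based scatter limit of Fig. 18/19 aside) could be written as a simple check:

# Illustrative only: the numeric rounded-indication limits quoted above.
# The chart-based scatter limits (API 1104 Fig. 18/19) are not captured here.
def rounded_indication_rejectable(individual_in: float,
                                  cluster_length_in: float) -> bool:
    """Return True if either numeric limit is exceeded."""
    if individual_in > 1 / 8:        # individual indication > 1/8 in (3.17 mm)
        return True
    if cluster_length_in > 1 / 2:    # clustered indications > 1/2 in (12.7 mm)
        return True
    return False


print(rounded_indication_rejectable(0.10, 0.60))  # True: cluster exceeds 1/2 in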
Quick question guys, was calbar supposed to send a separate email for us to sign the above doc, or is it the one that we signed when we registered for the bar exam?
Asking because tomorrow is the deadline for this, and I haven't received any additional emails from them, but I found this doc that I signed when I registered. Just want to make sure it's the same doc.
Even willing to work as low as $500 a week
Hi,
Are there any tips that you would like to share on how to carry out good control cabinet testing and Factory Acceptance Testing (FAT), including I/O testing, functional testing, etc.?
Thanks
The requirements for ultrasonic flaw detection equipment per AWS D1.1 (Structural Welding Code - Steel) are given below:
The ultrasonic test instrument shall be of the pulse-echo, A-scan type, suitable for use with transducers oscillating at frequencies between 1 and 6 MHz. The equipment shall have a valid calibration certificate. The test equipment shall have a calibrated gain control adjustable in discrete 1 or 2 dB steps over a range of at least 60 dB.
Transducers/search units with the following characteristics shall be used.
Angle beam: 70°, 60°, 45° (within ±2°).
Crystal size: The crystal shall be square or rectangular in shape and may vary from 5/8 in to 1 in (15 mm to 25 mm) in width and from 5/8 in to 13/16 in (15 mm to 20 mm) in height. The maximum width-to-height ratio shall be 1.2 to 1.0 and the minimum width-to-height ratio shall be 1.0 to 1.0.
The nominal probe frequency shall be between 2 and 2.5 MHz inclusive. Other probe frequencies may be used if material variables such as grain structure or size dictate.
Transducers shall have an active area of not less than 1/2 in² (323 mm²) and not more than 1 in² (645 mm²). The transducer shall be round or square. Transducers shall be capable of resolving the three reflections from the IIW V1 block, as shown below:
[Figure: resolution check using three reflections from the IIW V1 block]
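As a side note, the active-area figures quoted above are just unit conversions of the inch limits:

# 1 inch = 25.4 mm, so 1 in^2 = 25.4 * 25.4 = 645.16 mm^2.
SQ_IN_TO_SQ_MM = 25.4 ** 2
print(0.5 * SQ_IN_TO_SQ_MM)  # ~322.6 mm^2, the 1/2 in^2 lower limit (rounded to 323)
print(1.0 * SQ_IN_TO_SQ_MM)  # ~645.2 mm^2, the 1 in^2 upper limit (rounded to 645)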
Couplant: Polycell jelly, grease, water, or an equivalent material that provides satisfactory transmission of ultrasonic sound waves shall be used.
Scanning: The surface shall be free of weld spatter, surface irregularities, or foreign material that might interfere with scanning during the examination.
7.1. All calibrations and tests shall be made with the reject (clipping or suppression) control turned off. Use of the reject (clipping or suppression) control may alter the amplitude linearity of the instrument and invalidate test results.
7.2. Calibration for sensitivity and horizontal sweep (distance) shall be made by the ultrasonic operator just prior to, and at the location of, testing of each weld.
7.3. Recalibration shall be made after a change of operators, after each 30-minute maximum time interval, or when the electrical circuitry