The best feature your API can have is great performance, which makes performance testing essential for your APIs. But what makes API performance testing different from other types of performance testing, and how can you overcome its main challenges?
It’s a long plane ride to Kuala Lumpur, Malaysia. That’s where SofTec Asia 2017 was held the first few days of August. But, with all that I learned — both through formal presentations and informal discussions with fellow presenters — it was well worth the trip.
Cloud communications software provider Mitel was experiencing various difficulties with new software releases due to lack of sufficient testers on staff. XBOSoft was hired to provide the needed manpower. Over time, XBOSoft's range of services has expanded, with ever greater satisfaction on the part of the customer.
The complexity of account reconciliation software places special demands on its testers. The primary challenge: there are two types of accounts to be reconciled, each with its own unique characteristics.
Last week, I presented at the Atlanta Quality Assurance Association (AQAA) on "Managing Agile Software Projects Under Uncertainty." At the beginning of the talk, we covered the reasons agile projects fail and took a vote on why attendees thought, or had experienced, that agile projects fail. The results were quite interesting...
A new age is upon us: the Mobile IoT Period. Driven by the proliferation of mobile devices, and the smartphone in particular, we now face a complex, integrated infrastructure of technologies and concepts called the Internet of Things (IoT), all connected by mobile technology. The challenges we face in ensuring software quality for the myriad technologies being shaped by the Mobile IoT Period are daunting. Let's take a look at the top seven challenges of the Mobile IoT Period. Big Data: Big data would not be possible without mobile network technologies to get the data from one [...]
The changes sweeping through the healthcare data landscape created substantial challenges for the makers and users of healthcare software. Healthcare IT departments have often struggled to keep up with demands from healthcare providers, patients and government regulators. That’s why software developer Mobile MedSoft has partnered over time with XBOSoft to manage its quality assurance: XBOSoft consistently delivers high quality services and ensures that, when Mobile MedSoft releases a new product, it performs as expected.
Did you watch the Super Bowl? There was an 84 Lumber commercial showing a mother and daughter risking their lives trying to get to the United States from Mexico, depicting the long and arduous journey through the desert, only to face a giant wall separating them from America. Particularly poignant after what the now-president said about Mexican immigrants while campaigning, the ad purposefully showed one type of extremely arduous immigrant experience. It was gut-wrenching in a way that made you want to see what happened next to the mother and daughter, which drove traffic to the 84 Lumber site, where the rest of the video could be seen. The only problem: they didn't predict how successful they would be, and the incoming traffic crashed their site.
San Francisco, CA (PRWEB) February 8, 2017
XBOSoft will mark its Tenth Anniversary at the DeveloperWeek trade show in San Francisco. Founder and CEO Philip Lew and marketing vice president Steve Gorhes will be at booth 241 for two full days (Feb. 14-15) during the run of the show, Feb. 11-16.
The company celebrated the milestone in January with XBOSoft’s employees in Beijing, China, who perform most of the software testing and software QA consulting.
This is the second blog post referencing our recent “Performance Testing Considerations Using JMeter and Google Analytics” webinar. We received quite a few questions we couldn't answer in full during the webinar, so today we're addressing the JMeter specific questions.
Attendees asked XBOSoft VP of Engineering and speaker Ed Curran to go into more detail about his process. He put together responses in a few blog posts; the first of which focuses on Google Analytics-specific questions.
Our very own VP of Engineering Ed Curran presented a great webinar on Tuesday, Jan. 17 on performance and load testing with Apache JMeter. We've included some takeaways for you to use in your own load tests, and, for more info, you can watch the webinar in full below or view the slide deck.
Software performance testing services are in great demand by many of our clients, so I’ve recently been tasked with learning JMeter. JMeter is an open source tool, and like many open source tools, there are many resources available for those like me who are trying to learn. I thought I’d share tips on getting started with JMeter as well as some great resources for learning. It took me just five minutes to get JMeter installed and working on my brand new PC without any problem. It’s impressive that JMeter doesn’t require any complicated installation and configuration steps at all (unlike Appium, [...]
In addition to security testing, performance testing has come to the forefront of requests from our clients. As we go beyond basic 'it works' functionality, clients are now looking to take it to the next level. They've come to realize that performance has become more important as end-user expectations continue to rise. Whereas 3-4 seconds used to be 'good enough', users now expect 1-2 second response times, and even less for frequent usage scenarios. One of our clients recently told us they had 10,000 users and wanted to simulate all of them logging in simultaneously. After convincing them that this scenario was highly unlikely, I decided to put together a list of performance testing considerations.
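A quick back-of-the-envelope argument for why "all 10,000 users at once" is unrealistic can be made with Little's Law (concurrent users = arrival rate × average session duration). The traffic figures below are hypothetical, invented purely for illustration:

```python
# Sketch: estimate realistic concurrency with Little's Law instead of
# assuming every registered user logs in at the same instant.
# All traffic numbers below are hypothetical, for illustration only.

def concurrent_users(arrival_rate_per_sec: float, avg_session_sec: float) -> float:
    """Little's Law: L = lambda * W (users in system = arrival rate x time in system)."""
    return arrival_rate_per_sec * avg_session_sec

# Suppose 10,000 registered users generate ~2,000 sessions over a peak hour,
# each session lasting about 5 minutes on average:
rate = 2000 / 3600                          # session arrivals per second
print(round(concurrent_users(rate, 300)))   # ~167 concurrent users, not 10,000
```

Even under aggressive assumptions, the realistic concurrent load is a small fraction of the registered user base, which is why load profiles matter more than raw head counts.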
Mobile User Experience (Mobile UX) is a common subject these days when it comes to discussing how to keep users coming back and how to keep them engaged. I'll be discussing Mobile User Experience in my full-day workshop, Mobile UX is The New StoreFront, at the Practical Software Quality Conference this August 19 in San Diego. What many don't realize is that mobile UX is not just about placing buttons in certain places and having good contrast so people can see (usability), but more about providing an integrated experience specific to the mobile platform and specific to the tasks your users are trying to get done. For performance testing in mobile applications, there are many acceptance criteria or measurements you should examine, but we think these two are the most critical.
With the increase in smartphone usage around the world, mobile network performance testing is gaining importance. Up to 60% of mobile failures are performance related, and Google found that a 0.5-1 second increase in page load time resulted in a 20% decrease in traffic and revenue. We did a series of three webinars on the different aspects of mobile performance testing: Part 1, client application performance; Part 2, server performance; Part 3, network performance. Here is a recording of Part 3, covering mobile network performance testing. This webinar discusses: causes of network-related problems; what tools and services [...]
Rendezvous is a LoadRunner feature used to simulate multiple virtual users waiting at a rendezvous point so that they gather and execute at the same time, also known as concurrency. What else can Rendezvous do?
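The rendezvous idea itself is simple to picture: each virtual user does its own setup, then blocks until every user has arrived, so the target action fires concurrently. Here is a minimal sketch of that pattern using Python's `threading.Barrier`; it illustrates the concept only and is not LoadRunner code:

```python
# Sketch of a "rendezvous point": virtual users arrive at different times
# but are all released together, like LoadRunner's rendezvous function.
import threading
import time

NUM_USERS = 5
barrier = threading.Barrier(NUM_USERS)
release_times = []

def virtual_user(user_id: int):
    time.sleep(0.01 * user_id)                  # users arrive at different times
    barrier.wait()                              # rendezvous: block until all have arrived
    release_times.append(time.monotonic())      # everyone proceeds together

threads = [threading.Thread(target=virtual_user, args=(i,)) for i in range(NUM_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All users were released within a fraction of a second of each other:
print(max(release_times) - min(release_times) < 0.1)  # True
```

The barrier is what turns "five users sometime during the test" into "five users at the same instant", which is exactly the concurrency a rendezvous point is meant to create.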
Performance is a major issue in mobile user satisfaction. Testing the performance of your mobile applications and using the results of these tests to improve the performance is a key aspect of successfully delivering mobile software.
On August 8 2012, together with BlackLine Systems, we held a webinar on Performance Testing and Test Automation Best Practices entitled: How to Achieve and Maintain High Quality SaaS Software in the Cloud Donna McCollum from Blackline Systems and Philip Lew from XBOSoft discussed lessons learned and best practices on how to set up test automation and performance testing for cloud software that is delivered with a SaaS model. If you'd like to keep updated on upcoming webinars on performance testing or test automation or other QA related topics please follow us on twitter: Follow @XBOSoft
According to this chart from StatCounter, which depicts total US internet traffic (as measured by StatCounter), in June 2012 approximately 10 percent of all internet traffic came from mobile devices. And the trend is up. This should not come as a surprise. I assume we all use our mobiles for more and more online activities, and I would not be surprised if many of our readers already spend more than 10% of their total online time on one of their mobile devices. We see the same happening with our clients and the attention they direct towards [...]
Fiddler is a web debugging proxy tool that logs all HTTP(S) traffic between your computer and the internet. Fiddler allows you to inspect traffic and can be used by developers to debug web programs. It can also help testers inspect and examine the traffic between the user side of a web application and the web server. I'll talk about the tool from a tester's perspective and discuss some simple uses that we've run across.
After you’ve done all your homework and determined user scenarios and load profiles, you fire up your performance tool and begin to simulate those scenarios. If you are using LoadRunner, when you record a session and then execute it, you have probably encountered errors such as “The same user can not login twice” or “One user can only poll once”. When this happens, it blocks your performance test and stops you from moving forward. These errors occur because a server cannot accept the same user executing the same action twice: many applications are designed to keep precise statistics for each individual user, so if one user logs in 10 times (as you may try to simulate in a load test), the results do not conform with what the application expects and error messages are returned. Everyone encounters these types of problems, and there are numerous ways to troubleshoot them. One such method is correlation. Correlation is a process that ...
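What correlation automates is easy to show in miniature: capture a dynamic value (a session ID, a CSRF token) from one response and substitute it into the next request, so each simulated iteration looks unique to the server. The response snippet and token format below are invented for illustration; real values vary by application:

```python
# Sketch of correlation: extract a dynamic token from a response and
# reuse it in the next request. The token and field names are made up.
import re

login_response = '<input type="hidden" name="csrf_token" value="a1b2c3d4">'

# Same idea as defining left/right boundaries around the dynamic value
# in a load-testing tool's parameter-capture step:
match = re.search(r'name="csrf_token" value="([^"]+)"', login_response)
token = match.group(1)

next_request_body = f"action=poll&csrf_token={token}"
print(next_request_body)  # action=poll&csrf_token=a1b2c3d4
```

A recorded script replays the token captured at record time; correlation replaces that hard-coded value with a fresh one captured at run time, which is why it resolves the "same user twice" class of errors.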
Analyzing performance test results is critical to performance testing. Any results must be interpreted in light of the test’s objectives, or they will have little meaning. For example, an objective might be to measure the maximum load a server can handle (stress testing); to verify that a certain number of concurrent users can run stably for a certain time, such as 1,000 mail users running for 24 hours on a mail server (CHO, continuous hours of operation, or endurance testing); or to determine how many users the system can handle executing multiple scenarios while maintaining an average response time under X seconds (boundary testing). When running performance tests, it’s important to note that they are highly dependent on many variables...
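The last objective, an average response time under some threshold, is a simple computation once the raw timings are collected. A minimal sketch, using invented sample data and a hypothetical 2-second objective:

```python
# Sketch: reduce raw response times to the summary statistics you would
# compare against a test objective. The sample data below is invented.
import statistics

response_times = [0.8, 1.1, 0.9, 1.4, 2.2, 1.0, 0.7, 1.3, 0.95, 1.05]  # seconds

avg = statistics.mean(response_times)
p90 = statistics.quantiles(response_times, n=10)[-1]  # 90th percentile

objective_sec = 2.0  # hypothetical objective: average under 2 seconds
print(f"avg={avg:.2f}s  p90={p90:.2f}s  meets objective: {avg < objective_sec}")
```

Note how the average alone can hide outliers (the 2.2 s sample barely moves the mean), which is why percentile figures are usually reported alongside it when judging against an objective.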
We've been testing mobile applications in the field as well as in the lab for one of the largest mobile device manufacturers in the world. We test in the field, in different countries, to determine how different carrier networks, applications, and tasks impact the user experience. When doing mobile performance testing, there are any number of factors that can be measured and tracked. They will all vary depending on the device, its configuration, the network, the task being executed (what software and what action), etc.
For testing the performance of mobile application software, the testing focus is different than when testing general computer software due to the nature of embedded software systems. Embedded software is specially designed for each device, so software (including OS and application) is closely tied to the hardware. Some typical complaints for using mobile application software may be: