
Teamcenter - Replicating End-User Experience and generating metrics

This is my first post on this forum, so please bear with me.

 

Last year, we went live with Teamcenter 9.1 and NX 8.5. Since then, we have been experiencing performance problems with NX. At times, it can take 45 minutes to open a drawing! We have made many tweaks, and we are working with our Siemens on-site support staff toward further changes, some of them expensive. Our company VPs want metrics in order to decide on future big changes (for example, moving to Oracle Exadata); however, beyond a few use cases timed with TC/NX journals, we have not been able to provide them adequate metrics. My question is: how do other folks automate the user experience and gather metrics? I would love to replicate a 100-user load on the environment during a holiday weekend while gathering statistics on each session and workstation. I understand this may not be one application, but rather a group of applications. We have considered developing this in house; however, I wanted to start here first and get some feedback.
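For the load-generation side, the rough shape I have in mind is something like the PowerShell sketch below. Run-SingleUserTest.ps1 and the share path are purely placeholders for whatever would actually drive a scripted TC/NX session in our environment:

# Sketch only: fan out N simulated users as background jobs and collect timings.
$userCount = 100
$jobs = 1..$userCount | ForEach-Object {
    Start-Job -ArgumentList $_ -ScriptBlock {
        param($userId)
        $sw = [System.Diagnostics.Stopwatch]::StartNew()
        # Placeholder: whatever actually drives a scripted TC/NX session
        & "C:\perftest\Run-SingleUserTest.ps1" -UserId $userId
        $sw.Stop()
        [pscustomobject]@{ UserId = $userId; Seconds = $sw.Elapsed.TotalSeconds }
    }
}
$results = $jobs | Wait-Job | Receive-Job
$results | Export-Csv "\\fileserver\perftest\load_results.csv" -NoTypeInformation

In practice we would spread the workers across real workstations rather than one box, since 100 jobs on one machine would be its own bottleneck. The hard part is obviously what goes inside the per-user test, not the fan-out. Any thoughts?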

13 REPLIES

Re: Teamcenter - Replicating End-User Experience and generating metrics

Hello all - this is also my first post - just subscribed yesterday.

 

I'm new to my company (a few months in).  We currently use Creo and SolidWorks, no NX; all CAD data is currently in Windchill.  The decision has been made to use TC as our primary PLM and migrate everything from Windchill to TC.

 

All users are worried about performance problems using CAD (through the integrations) with TC; performance with Windchill is currently excellent.


So - I'm not able to offer any help on this specific question yet, but I will be highly involved in exercises to prove out performance prior to the production migration, and I have great interest in the topic.  Note: I have set up a fully automated performance test, runnable by any user, for Creo data stored in Windchill.  It exercises all the typical steps and can be run as a scheduled job.  I'm hoping to either find something similar for Creo and SW with Teamcenter or create it.

 

I'm currently at PLM World in Orlando by the way.

Re: Teamcenter - Replicating End-User Experience and generating metrics

Thanks for the reply! I hope you're enjoying your visit to PLM World. Sadly, I couldn't go this year; however, several others from my group are there now. Hopefully they are networking and will bring back worthwhile information.

As for performance, our user base is largely unhappy. We have been working for months to iron out performance problems. But how do you even define a performance problem? When we make a change that reduces load times from 25 minutes to 20, does a user care? On paper it's a 20% drop, but in modern computing, is waiting 20 minutes to open something reasonable? It isn't.

With that said, we need actual numbers. I don't want individual use cases; I want to be able to generate reports on the time it takes every single user to open and save data in TC/NX/VIS. I want ranges and averages of times. Right now, unfortunately, we depend largely on running journals to record metrics. That method requires someone sitting in front of the machine running the journal, which can take an hour. You can quickly see how the time gets eaten up.
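If we can ever get the raw timings into a flat file, the reporting side should be the easy part. Roughly, in PowerShell (the CSV layout with User, Operation, Seconds columns is just an assumed format, not anything we have today):

# Sketch: min/max/average per operation from a timing log.
Import-Csv "\\fileserver\perftest\timings.csv" |
    Group-Object Operation |
    ForEach-Object {
        $stats = $_.Group | ForEach-Object { [double]$_.Seconds } |
                 Measure-Object -Minimum -Maximum -Average
        [pscustomobject]@{
            Operation = $_.Name
            Samples   = $stats.Count
            MinSec    = [math]::Round($stats.Minimum, 1)
            MaxSec    = [math]::Round($stats.Maximum, 1)
            AvgSec    = [math]::Round($stats.Average, 1)
        }
    } | Format-Table -AutoSize

The collection side is the real problem, not the math.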

 

Re: Teamcenter - Replicating End-User Experience and generating metrics

This may not be directly applicable, but I'd be happy to show live how my automated test works for Creo data in Windchill.  It may either motivate someone to volunteer what already exists for TC or motivate someone to create it.

 

The test we run:

- works from a simple .bat file in a cmd window any user can launch

- launches Creo Parametric and registers to Windchill

- searches for a test top-level assembly (890 files / 500 MB)

- checks out, downloads (fetches), opens in UI

- exports to local drive, erases local workspace (cache)

- imports from local drive, checks in (saves)

- writes all times to a simple log file

 

So far we have been doing prep exercises using laptops with IPEM and SWIM connected wirelessly to a test Teamcenter instance on a small server.  Times are very slow, but we are not treating this as representative of production.
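Stripped of the Creo/Windchill specifics, the wrapper logic itself is nothing fancy.  In PowerShell terms it boils down to something like the sketch below; the step .bat files are placeholders standing in for the real Creo/Windchill operations listed above:

# Sketch of the timing wrapper; each step command is a placeholder.
$log   = "C:\perftest\creo_test_log.txt"
$steps = [ordered]@{
    "launch and register" = { & "C:\perftest\step_launch.bat" }
    "fetch and open"      = { & "C:\perftest\step_fetch_open.bat" }
    "export and clear"    = { & "C:\perftest\step_export_clear.bat" }
    "import and checkin"  = { & "C:\perftest\step_import_checkin.bat" }
}
foreach ($name in $steps.Keys) {
    $elapsed = Measure-Command { & $steps[$name] }
    "{0}: {1:N1} s" -f $name, $elapsed.TotalSeconds | Add-Content $log
}

Measure-Command gives you the per-step wall time for free, and the log ends up as one line per step per run.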

Re: Teamcenter - Replicating End-User Experience and generating metrics

Actually, that does help. Right now, we are investigating ways to do a similar function, only fully automated, from the Windows login to the start of the TC/NX/VIS journal. We are thinking about leveraging Worksoft Certify to kick off a journal while also kicking off a PowerShell script that would record local metrics. Once completed, it would report the metrics back to a central location. Like I said, it is not in practice yet; we are still researching OOTB methods we could use. We are also investigating Worksoft's and HP's automated testing software (UFT). I am interested in hearing other folks' feedback on this topic. Surely others are performing automated testing!
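The local-metrics piece we're prototyping looks roughly like the sketch below; the journal launcher and all the paths are placeholders for our environment:

# Rough prototype: time the journal run, tag it with workstation/user info,
# and push the result to a central share. All paths are placeholders.
$machine = $env:COMPUTERNAME
$user    = $env:USERNAME
$start   = Get-Date
# Placeholder for however the TC/NX journal gets kicked off on this box
& "C:\perftest\run_nx_journal.bat"
$elapsed = (Get-Date) - $start
[pscustomobject]@{
    Machine   = $machine
    User      = $user
    StartTime = $start
    Seconds   = [math]::Round($elapsed.TotalSeconds, 1)
} | Export-Csv "\\fileserver\perftest\results\$machine-$(Get-Date -Format yyyyMMdd-HHmmss).csv" -NoTypeInformation

Certify would kick this off at Windows login, and the central share gives us one place to roll the numbers up from.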

Re: Teamcenter - Replicating End-User Experience and generating metrics

Unfortunately, instrumenting to collect usable metrics is difficult at best on your current version. You'll need to enable some enhanced logging by making a few entries in TC_DATA\tc_profilevars.bat. However, you won't want to leave them enabled for very long because of the large log files they generate. Putting these settings into tc_profilevars.bat enables them for the whole of Teamcenter, and they will drag performance down a little while they're running.

1) set TC_Journalling=ON
- Creates a .jnl file in your %temp% directory, which can be used for debugging issues related to Teamcenter and its performance.

2) set TC_SLOW_SQL=3.0
- Logs every SQL query whose execution takes longer than 3 seconds.

3) set API_JOURNAL=FULL
- Enables Teamcenter Integration for NX journalling.

4) set TC_TRACEBACK=ON
- When errors occur, the system writes tracebacks to the system log file by default; use this environment variable to override the default behavior.

5) set TC_POM_JOURNALLING=N
- Determines whether the POM module journals nested calls.

6) set TC_JOURNAL_LINE_LIMIT=0

7) set TC_Journal_Modules=ALL

8) set TC_SQL_DEBUG=BJTP

9) set TC_JOURNAL=FULL

 

These are not all of them but they are probably the most helpful. You can look them up in the preferences guide that comes with the Teamcenter documentation.

 

After running them for a while, you have to go collect the logs. After you collect the logs, you have to read them. And that is its own challenge.

 

Target the tcserver.syslog first. It could be as simple as adding a few indexes to Oracle. Good luck!
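If it helps, a first pass over a pile of collected syslogs can be as crude as this PowerShell snippet. The match pattern is a guess on my part; adjust it to whatever your TC_SLOW_SQL lines actually look like:

# Crude first pass: pull candidate slow-SQL lines out of collected syslogs.
# The "SQL" pattern is a placeholder; match it to your actual log format.
Get-ChildItem "C:\logs\collected" -Filter "*.syslog" -Recurse |
    Select-String -Pattern "SQL" -SimpleMatch |
    Select-Object Filename, LineNumber, Line |
    Export-Csv "C:\logs\slow_sql_candidates.csv" -NoTypeInformation

Even a rough cut like that tells you which servers and which times of day to dig into.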


Randy Ellsworth, Teamcenter Architect, Applied CAx, LLC
NX 11.0.1.mp01 | SW 2016 | TcUA 11.2.3
Evaluating: AW 3.2

Re: Teamcenter - Replicating End-User Experience and generating metrics

BTW, there is a tool called Journal Workbench that can make your job a whole lot easier. The current version is 3.0.0, and it's downloadable from the GTAC downloads site. You'll need a license to run it, so purchase one from your VAR. Believe me when I say it is worth the small cost.

Randy Ellsworth, Teamcenter Architect, Applied CAx, LLC
NX 11.0.1.mp01 | SW 2016 | TcUA 11.2.3
Evaluating: AW 3.2

Re: Teamcenter - Replicating End-User Experience and generating metrics

Automated testing against TC/NX can be set up with Eclipse plus the free plugins Sikuli/AutoIt/TestNG. The test cases are executed strictly based on screenshots taken previously and/or preset button names and IDs. This works similarly to HP UFT, which is much more expensive.

 

Regarding the metrics you need, the time taken by each test case could perhaps serve as a reference; that time can be found in the test report from TestNG.

 

The challenge with such automated testing against TC (the same goes for HP UFT) is that it's quite hard to find the corresponding button names and IDs inside TC, so you have to rely on preset screenshots. You have to retake the screenshots whenever the UI changes, which is time-consuming. Running on a remote client also has stability issues due to network delay.

 

Hope the above helps.

 

br, Ethan

e.chen@live.com

 

 

Re: Teamcenter - Replicating End-User Experience and generating metrics

Thanks, Randy, for your response. I do understand we are limited in what we can do in our current version. We are working on our first install of TC 11, but that's an entirely different discussion.

 

We do use the following temporarily. We don't use them very often, and yes, the logs get big and can be difficult to read.

set TC_JOURNALLING=ON
set API_JOURNAL=FULL

set TC_TRACEBACK=ON

set TC_POM_JOURNALLING=N

set TC_JOURNAL_LINE_LIMIT=0

set TC_JOURNAL_MODULES=ALL

set TC_SQL_DEBUG=BJPT

set TC_JOURNAL=FULL

 

The only one from your list that we don't seem to have used is TC_SLOW_SQL=3.0. I will look into it in our test environment when we enable these switches again.

We are currently investigating building some automation that will cycle through the syslogs, hopefully to generate a report on what our users are opening and the associated times.

 

Working with our on-site Siemens staff, we have periodically added indexes.

Re: Teamcenter - Replicating End-User Experience and generating metrics

Hi there Boar,

 

First, I think you should start with the FMS server in order to understand server connections and performance. I had the same problem with TC back in the day with Catia. Everything seemed fine, but allocated RAM turned out to be the problem, even though the machine had 64 GB. As I investigated deeper, it turned out to be a problem with VMware's (or any VM provider's) memory management. After sorting out the allocated RAM, we managed to use TC-Catia with full functionality.

 

The software is here: https://technet.microsoft.com/en-us/sysinternals/rammap.aspx
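Before digging into RAMMap, a quick sanity check from PowerShell can also show whether the VM really sees the RAM it was allocated:

# Quick check: total vs. free physical memory as the OS sees it.
$os = Get-CimInstance Win32_OperatingSystem
$totalGB = [math]::Round($os.TotalVisibleMemorySize / 1MB, 1)  # counters are in KB
$freeGB  = [math]::Round($os.FreePhysicalMemory / 1MB, 1)
"Total physical memory: $totalGB GB, free: $freeGB GB"

If the totals look wrong here, the problem is below Teamcenter entirely.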

 

Kind Regards,

 

Yagiz