Sunday, 29 November 2020

JAVA Threads

 The following information helps in understanding the thread executor and the hierarchy of its interfaces and classes.


The following code shows how to use a ThreadPoolExecutor to create connections while starting an ATG Nucleus component.

Using a ThreadPoolExecutor has three main points:

1)Define the ThreadPoolExecutor by setting the minimum and maximum thread counts, the keep-alive time, its TimeUnit, and a LinkedBlockingQueue.

2)Define the Callable object. (Callable is an interface with a call() method; we need to implement the actual logic there, in our case creating the connection object.)

3)Invoke the submit() method on the ThreadPoolExecutor, passing the Callable object.

1)Write the logic below in the doStart() method:

  ThreadPoolExecutor  executor = new ThreadPoolExecutor(0, 5, 20000, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());

setThreadExecutor(executor);


2)Implement the logic for Callable as follows

Callable<String> cuCallableObj = new Callable<String>() {
    public String call() throws Exception {
        HttpEntity entity = null;
        String resString = null;
        CloseableHttpClient client = null;
        CloseableHttpResponse response = null;
        try {
            HttpClientBuilder httpClientBuilder = HttpClientBuilder.create();
            if (isEnableProxy()) {
                HttpHost proxy = new HttpHost(getProxyHost(), getProxyPort());
                client = httpClientBuilder.setProxy(proxy).build();
            } else {
                client = httpClientBuilder.build();
            }
            HttpPost httpPost = new HttpPost(url);
            httpPost.addHeader("Content-Type", "application/json");
            httpPost.setEntity(new StringEntity(pJSON));
            response = client.execute(httpPost);
            if (response != null) {
                entity = response.getEntity();
            }
            if (entity != null) {
                resString = EntityUtils.toString(entity);
            }
        } catch (Exception ex) {
            vlogError(ex, "Exception occurred inside the getGlobalConnectionPool()");
        } finally {
            // Close the response and client to avoid leaking connections.
            if (response != null) {
                response.close();
            }
            if (client != null) {
                client.close();
            }
        }
        return resString;
    }
};

3)Then call the submit() method on the ThreadExecutor as follows.


Future<String> futureResponseObject = getThreadExecutor().submit(cuCallableObj);
if (futureResponseObject != null) {
    try {
        responseObject = futureResponseObject.get(2000000, TimeUnit.MILLISECONDS);
    } catch (TimeoutException ex) {
        vlogError(ex, "TimeoutException occurred inside the getGlobalConnectionPool()");
    } catch (InterruptedException ex) {
        // Future.get() also declares InterruptedException; restore the interrupt flag.
        Thread.currentThread().interrupt();
        vlogError(ex, "InterruptedException occurred inside the getGlobalConnectionPool()");
    } catch (ExecutionException ex) {
        vlogError(ex, "ExecutionException occurred inside the getGlobalConnectionPool()");
    }
}
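Stripped of the ATG and HttpClient specifics, the executor / Callable / Future pattern above can be sketched with plain JDK classes. This is a minimal sketch; the class name and the canned response value are illustrative only, not part of the original component.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ExecutorSketch {

    public static String fetch() throws Exception {
        // Same pool shape as above: 0 core threads, max 5, 20-second keep-alive.
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                0, 5, 20000, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());

        // The Callable carries the work; here it just returns a canned response
        // instead of executing an HTTP call.
        Callable<String> task = new Callable<String>() {
            public String call() {
                return "response-body";
            }
        };

        Future<String> future = executor.submit(task);
        try {
            // Wait with a timeout, mirroring the get(timeout, unit) call above.
            return future.get(2000, TimeUnit.MILLISECONDS);
        } finally {
            executor.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fetch());
    }
}
```

The try/finally around get() ensures the pool is shut down even when the task times out, which the inline snippet above delegates to the component's lifecycle (doStop()).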












Tuesday, 15 September 2020

ATG 11.3.2 info and issues

 The following section explains issues and provides information about upgrading to ATG 11.3.2.

Info



Issues

Issue 1)Even though promotions are created properly in BCC, they do not work as expected.

Solution: ATG 11.3.2 introduces the following flag, which is enabled by default and causes this behaviour. Disable it so that promotions work as expected:

ItemPricingEngine.useRulesBasedPreValidation=false

Friday, 1 May 2020

ATG 11.1 Responsive information

How do we get static content in the responsive design programmatically, i.e. using a service?


TargetingForEach targetingForEach = (TargetingForEach) ServletUtil.getCurrentRequest().resolveName("/atg/targeting/TargetingForEach");
pRequest.setParameter("targeter", (DynamicContentTargeter) ServletUtil.getCurrentRequest().resolveName(targeterComponent));
if (targeterComponent.equalsIgnoreCase("targeter component name referred from one of the ATG components")) {
    pRequest.setParameter("fireViewItemEvent", false);
    pRequest.setParameter("fireContentEvent", false);
    pRequest.setParameter("fireContentTypeEvent", false);
}
targetingForEach.service(pRequest, pResponse);
return (String) pRequest.getParameter("element.contentData");

In the above code, targeterComponent is the Nucleus path of the required ATG targeter component.


The same functionality, with a small difference: the element name is passed in and the content is read with getLocalParameter():

TargetingForEach targetingForEach = (TargetingForEach) ServletUtil.getCurrentRequest().resolveName("/atg/targeting/TargetingForEach");
pRequest.setParameter("targeter", (DynamicContentTargeter) ServletUtil.getCurrentRequest().resolveName(targeterComponent));
pRequest.setParameter("elementName", pElementName);
if (targeterComponent.equalsIgnoreCase("targeter component name referred from one of the ATG components")) {
    pRequest.setParameter("fireViewItemEvent", false);
    pRequest.setParameter("fireContentEvent", false);
    pRequest.setParameter("fireContentTypeEvent", false);
}
targetingForEach.service(pRequest, pResponse);
return (String) pRequest.getLocalParameter(pElementName + ".contentData");



Sunday, 19 April 2020

Invoking store procedure in ATG

The following explains how to call/execute a stored procedure in ATG.


1)First configure the stored procedure call string (pProcedureName, a String parameter) at the ATG component level:
xxxStoreProcedureQuery=call xxx_PROC(?, ?)

  Here xxx_PROC is the stored procedure name at the DB level.


2)Get the connection:

private Connection getDBConnection() {
    try {
        return ((GSARepository) getOrderRepository()).getDataSource().getConnection();
    } catch (SQLException exp) {
        vlogError("Exception occurred while getting the connection", exp);
    }
    return null;
}


3)Call the stored procedure as follows:

public String invokeStoreProcedure(String pSiteId, String pProcedureName) {
    String sequenceNumber = EMPTY_STRING;
    boolean rollback = true;
    Connection jdbcConn = null;
    TransactionManager tm = null;
    TransactionDemarcation td = null;
    CallableStatement procStmt = null;

    try {
        tm = getTransactionManager();
        td = new TransactionDemarcation();
        td.begin(tm, TransactionDemarcation.REQUIRES_NEW);
        jdbcConn = getDBConnection();
        procStmt = jdbcConn.prepareCall(pProcedureName);
        procStmt.setString(1, pSiteId);
        procStmt.registerOutParameter(2, java.sql.Types.VARCHAR);
        procStmt.executeUpdate();
        sequenceNumber = procStmt.getString(2);
        rollback = false;
    } catch (Exception ex) {
        if (isLoggingError()) {
            logError("Exception occurred while executing the stored procedure", ex);
        }
    } finally {
        try {
            if (td != null) {
                td.end(rollback);
            }
        } catch (TransactionDemarcationException e) {
            vlogError("TransactionDemarcationException while ending the transaction", e);
        }
        closingConnection(null, null, jdbcConn, procStmt);
    }
    return sequenceNumber;
}


4)Then call it wherever it is required:

  String requiredValue = invokeStoreProcedure(pSiteId, pProcedureName);

Saturday, 4 April 2020

ATG problems and Solutions


Problem 1)Two IP addresses mapping to the same host name.
The error shows that two IP addresses are mapped to the same host name.

Solution: To resolve this, copy protocol.jar to the application server (e.g. WebLogic).

Problem 2) Hot fix location in the generated EAR:

atglib\_atghome_slocallib

We can copy our .class files into this path.

Problem 3) Getting the following error while starting the server:

/atg/scenario/ScenarioManager   Unable to combine messaging information from the process manager component /atg/scenario/ScenarioManager. The process manager has not been classified yet so it cannot be determined if global messages should be listened for. This error indicates a problem with component startup order - the /atg/scenario/ScenarioManager component has been started before the /atg/dynamo/messaging/MessagingManager component
/atg/dynamo/messaging/MessagingManager  An exception occurred while trying to parse XML definition file "XMLFile(/atg/dynamo/messaging/dynamoMessagingSystem.xml)"       java.net.MalformedURLException: unknown protocol: dynamosystemresource
/atg/dynamo/messaging/MessagingManager          at java.net.URL.<init>(URL.java:617)
/atg/dynamo/messaging/MessagingManager          at java.net.URL.<init>(URL.java:507)
/atg/dynamo/messaging/MessagingManager          at java.net.URL.<init>(URL.java:456)
/atg/dynamo/messaging/MessagingManager          at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:620)


Solution:
Copy the protocol.jar file into the domain's lib folder and add it to the classpath:

SET CLASSPATH=.;D:\XX\weblogic\user_projects\domains\NA_domain\lib\protocol.jar






Monday, 23 December 2019

Spring Project Creation

Spring Project Creation

1)Project creation at spring Initializer

The following link is used to create projects, where we can choose the Java and Spring versions and the dependencies (Spring Web, Tools, H2, JPA, etc.):
https://start.spring.io/
 i)Select Maven project (selected by default)
ii)Select Java as the language
iii)Select the Spring Boot version, e.g. 2.2.2 or a 2.2.3 snapshot
iv)Enter the Group and Artifact

 Optional settings:
  Packaging: jar (selected by default)
  Java: version 8 can be selected
Once we give all the information, we click the Generate button, which generates the project and downloads it to a local location.
2)Importing the project into STS and running the application
  Once the download completes, extract the zip (e.g. with 7-Zip) and import it into STS (Spring Tool Suite).
  Import by right-clicking in the Navigator or Package Explorer and selecting Import.
  Expand the Maven option, select "Existing Maven Projects", click Next, and browse to (or paste) the path where the project was downloaded.
  Then click the Finish button.
  Verify the project and check the pom.xml file to confirm the required dependencies are added.
  Once everything is fine, right-click the pom.xml file, select the Maven option, and select Update Project.
  Update Project downloads all the required jars (over the network) for the defined dependencies.


Friday, 6 September 2019

GiftWrap Handling info

GiftWrap handling
I)How to create a gift-wrap commerce item.
  The following block should be part of a synchronized block.
 1)Get the SKU id and product id from the component configuration.
 2)Create the commerce item by passing giftwrapItemType, skuId, productId, and qty as parameters.
 3)Add the commerce item to the order as follows: ciManager.addAsSeparateItemToOrder(order, gWItem);
 4)Get the shipping group: ShippingGroup sg = (ShippingGroup) getOrder().getShippingGroups().get(0);
 5)Add the item quantity to the shipping group as follows: ciManager.addItemQuantityToShippingGroup(order, gWItem.getId(), sg.getId(), qty);
 6)Set the quantity on the gift-wrap commerce item: gWItem.setQuantity(qty);
 7)Get the extra parameters map: Map extraParams = createRepriceParameterMap();
 8)Reprice the order as follows:
                String repriceOrderPricingOp = PricingConstants.OP_REPRICE_ORDER_TOTAL;
                runProcessRepriceOrder(repriceOrderPricingOp, order,
                        getUserPricingModels(), getUserLocale(), getProfile(),
                        extraParams);
9)Call CommerceItemManager.updateOrder(order);

Note: the gift-wrap product and SKU are already created in BCC.

Thursday, 21 February 2019

Behavior of the asset_version and sec_asset_version columns of multi tables at the ATG PUB DB level.

Behavior of the asset_version and sec_asset_version columns of multi tables at the PUB DB level.


An example table is dcs_cat_chldprd.
These columns behave as follows:
1)When only the category is updated, the asset_version value is increased (mapping to dcs_category).
2)When only the product is updated, the sec_asset_version value is increased (mapping to dcs_product).
3)If both are changed in a single project, the following happens:
  a)First, the latest (increased) asset_version holds the last version of sec_asset_version (product). Even if we remove it via the backend (DB), it won't cause any issue at the BCC level.
  b)The next/updated sec_asset_version is then mapped to the latest asset_version.
4)The seq_number is not updated along with asset_version and sec_asset_version.
5)The seq_number changes only when the order of the products changes at the category level.
6)If the seq_number order is changed/corrupted (0,1,3,4,...) instead of the proper order (0,1,2,3,...), BCC won't allow changes to that particular category.

All the above steps are verified practically.
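Point 6 above, a corrupted seq_number order, can be verified with a small standalone check. This is an illustrative sketch (the class and method names are not part of ATG): a sequence is healthy only when it forms the unbroken run 0,1,2,...,n-1.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class SeqNumberCheck {

    // Returns true when the sequence numbers form the expected unbroken
    // run 0,1,2,...,n-1; any gap or duplicate indicates corruption.
    public static boolean isContiguous(List<Integer> seqNumbers) {
        List<Integer> sorted = new ArrayList<>(seqNumbers);
        Collections.sort(sorted);
        for (int i = 0; i < sorted.size(); i++) {
            if (sorted.get(i) != i) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isContiguous(Arrays.asList(0, 1, 2, 3))); // proper order
        System.out.println(isContiguous(Arrays.asList(0, 1, 3, 4))); // gap at 2
    }
}
```

The input list would come from querying the multi table (e.g. dcs_cat_chldprd) for one category and asset version.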

Thursday, 24 January 2019

GoogleFeedInfo_SitemapFeedInfo



**************Google Feed Info********************

***************************************************
GoogleFeedScheduler.properties
/u01/oracle/atg/data/feed/outbound/googleFeed/conf/Liverpool/lp_conf.xml



1)html_template
2)lp_conf.xml
3)lp_xml_template.xml
4)searchterms.xml
5)staticpages.txt
6)urlconfig.xml

lp_conf.xml contains query fields such as:


<QUERY_FIELD>product.hybridProducts</QUERY_FIELD>   
   <QUERY_FIELD>product.relProdSequence</QUERY_FIELD>
 
 
   files destination Info********************
   GoogleFeedWriter.properties
     googleFeedLocalDirectory=/u01/oracle/atg/data/feed/outbound/googleFeed   (from SIT)
 

************Site Map Info***************************
****************************************************


GenerateSitemapScheduler.properties

configurationFilePath = /u01/oracle/atg/data/Sitemap/conf/conf.xml

1)conf.xml
2)html_template.xml
3)searchterms.xml
4)staticpages.txt
5)urlconfig.xml
6)xml_template.XML



conf.xml

<QUERY_FIELD>product.hybridProducts</QUERY_FIELD>   
   <QUERY_FIELD>product.relProdSequence</QUERY_FIELD>

Monday, 21 January 2019

keytool_CRC | need to check duplicate

Linux Machine: Keytool should be from

/opt/jdk1.6.0_29/bin/keytool


1. /opt/jdk1.6.0_29/bin/keytool -genkey -alias jbosskey -keypass crchome -keyalg RSA -keystore crcSecurePages.keystore
Password: crchome

                                What is your first and last name?
                                [Unknown]:  Chain Reaction
                                What is the name of your organizational unit?
                                [Unknown]:  Cycles
                                What is the name of your organization?
                                [Unknown]:  CRC
                               What is the name of your City or Locality?
                                [Unknown]:  Doagh
                                What is the name of your State or Province?
                                [Unknown]:  Northern Ireland
                                What is the two-letter country code for this unit?
                                [Unknown]:  UK
                                Is CN=Chain Reaction, OU=Cycles, O=CRC, L=Doagh, ST=Northern Ireland, C=UK correct?
                                [no]:  Y                 

2.   /opt/jdk1.6.0_29/bin/keytool -list -keystore crcSecurePages.keystore
3./opt/jdk1.6.0_29/bin/keytool -export -alias jbosskey -keypass crchome -file crcSecurePages.crt -keystore crcSecurePages.keystore
4. /opt/jdk1.6.0_29/bin/keytool -import -alias jbosscert -keypass crchome -file crcSecurePages.crt -keystore crcSecurePages.keystore
            Do you still want to add it? [no]:  Y
5. /opt/jdk1.6.0_29/bin/keytool -list -keystore crcSecurePages.keystore
            Enter keystore password: crchome
6. Edit "<C:/yourServerLocation>/server/crc_ca/deploy/jbossweb.sar/server.xml"
   - Uncomment the section that begins with <Connector port="8443"
   - At the end of the section (but still inside it) add:
       keystoreFile="<C:/yourServerLocation>/server/crc_ca/conf/crcSecurePages.keystore"
       keystorePass="crchome"
7. Ensure that you start the server with (for example: run -c crc_ca -b localhost or IPAddress -Djavax.net.ssl.trustStore=C:/CRC/jboss-eap-5.1/jboss-as/server/crc_ca/conf/crcSecurePages.keystore):
   -c crc_ca -b 0.0.0.0 -Djavax.net.ssl.trustStore="<C:/yourServerLocation>/server/crc_ca/conf/crcSecurePages.keystore"
   - Here -c specifies your server type.
   - -b is required to use the server as anything but localhost: use the server name if you have only one network card, or 0.0.0.0 if you have multiple network cards.
   - -Djavax.net.ssl.trustStore specifies the location of your truststore.
   - On Windows you may place these parameters in a shortcut you use to execute run.bat.
   - On Unix you may place them in your startup script.

keyStoreInfo Configuration


1)Configure the host file:
156.45.80.38      1fs.ecomm.maritzdev.com

2)Copy the lib files into the following path:

%JBOSS_HOME%\lib\endorsed
3)Copy server.xml into the following location (to maintain the keystore file):

 %JBOSS_HOME%\server\maritz_production\deploy\jbossweb.sar
4)Copy test.keystore into the following (appropriate or configured) location:

%JBOSS_HOME%\server\maritz_production\conf

5)Copy run.conf.bat to %JBOSS_HOME%\bin

secure_certificate_setup_Store

On Linux Machine: Keytool should be from

/opt/jdk1.6.0_29/bin/keytool


1. /opt/jdk1.6.0_29/bin/keytool -genkey -alias jbosskey -keypass crchome -keyalg RSA -keystore crcSecurePages.keystore
Password: crchome

                                What is your first and last name?
                                [Unknown]:  Chain Reaction
                                What is the name of your organizational unit?
                                [Unknown]:  Cycles
                                What is the name of your organization?
                                [Unknown]:  CRC
                                What is the name of your City or Locality?
                                [Unknown]:  Doagh
                                What is the name of your State or Province?
                                [Unknown]:  Northern Ireland
                                What is the two-letter country code for this unit?
                                [Unknown]:  UK
                                Is CN=Chain Reaction, OU=Cycles, O=CRC, L=Doagh, ST=Northern Ireland, C=UK correct?
                                [no]:  Y                 

2.   /opt/jdk1.6.0_29/bin/keytool -list -keystore crcSecurePages.keystore
3./opt/jdk1.6.0_29/bin/keytool -export -alias jbosskey -keypass crchome -file crcSecurePages.crt -keystore crcSecurePages.keystore
4. /opt/jdk1.6.0_29/bin/keytool -import -alias jbosscert -keypass crchome -file crcSecurePages.crt -keystore crcSecurePages.keystore
            Do you still want to add it? [no]:  Y
5. /opt/jdk1.6.0_29/bin/keytool -list -keystore crcSecurePages.keystore
            Enter keystore password: crchome
6. Edit "<C:/yourServerLocation>/server/crc-production/deploy/jbossweb.sar/server.xml"
   - Uncomment the section that begins with <Connector port="8443"
   - At the end of the section (but still inside it) add:
       keystoreFile="<C:/yourServerLocation>/server/crc-production/conf/crcSecurePages.keystore"
       keystorePass="crchome"
7. Ensure that you start the server with (for example: run -c crc-production -b localhost or IPAddress -Djavax.net.ssl.trustStore=C:/CRC/jboss-eap-5.1/jboss-as/server/crc-production/conf/crcSecurePages.keystore):
   -c crc-production -b 0.0.0.0 -Djavax.net.ssl.trustStore="<C:/yourServerLocation>/server/crc-production/conf/crcSecurePages.keystore"
   - Here -c specifies your server type.
   - -b is required to use the server as anything but localhost: use the server name if you have only one network card, or 0.0.0.0 if you have multiple network cards.
   - -Djavax.net.ssl.trustStore specifies the location of your truststore.
   - On Windows you may place these parameters in a shortcut you use to execute run.bat.
   - On Unix you may place them in your startup script.

ATG BCC queries

The following section lists different queries for fetching information.

1)Query returning asset info (e.g. a category), given the asset id and the project id:

select * from dcs_category where category_id='cat20070' and  WORKSPACE_ID =(select id from avm_devline where name =(select workspace from epub_project where project_id = 'prj350096'));

2)Query returning project details, given the workspace id:



select * from epub_project where WORKSPACE = (select name from avm_devline where id ='64352');

3)Fetching the assets by project name from a particular or custom table:

select * from xx_product_dyn_attribute where WORKSPACE_ID in ((select id from avm_devline where name in (select WORKSPACE from epub_project where DISPLAY_NAME ='projectName')));



4)Verifying the BCC deployment log using a project id:

select * from epub_dep_log where dep_id in (select deployment_id from epub_deploy_proj where project_id ='2345')



5)Fetching the workspace id using the project id:

select workspace_id from epub_prj_targt_ws where project_id ='12345';


6)Fetching the target id using the project id:

select target_id from epub_prj_targt_ws where project_id ='12345';

7)Fetching the status of a project at a specific target using the project id:
select status_code from epub_pr_tg_status where project_id='12345';


8)Fetching the creation time of a project's snapshot using the project id and target id:
select snapsht_creat_tm from epub_pr_tg_st_ts  where project_id='proj1234' and target_id='targ1234';

9)Fetching the workspace status using the project name:

select * from avm_workspace where ws_id in (select id from avm_devline where name in (select WORKSPACE from epub_project where DISPLAY_NAME ='projectName'));

10)Fetching asset repository ids and repository names, locked or not, using the project name:

select * from avm_asset_lock where workspace_id in (select id from avm_devline where name in (select WORKSPACE from epub_project where DISPLAY_NAME ='projectName'));


11)Identifying duplicate records in a multi/intermediate table:

select category_id, asset_version, sequence_num, count (*), max(child_prd_id), min(child_prd_id)
from pub.dcs_cat_chldprd
group by category_id,asset_version, SEQUENCE_NUM
having count(*) > 1 and (max(child_prd_id) != min(child_prd_id));
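The same duplicate check can be prototyped in memory over exported rows of dcs_cat_chldprd. This is an illustrative sketch (the Row record and class name are assumptions, not ATG types); the grouping mirrors the SQL's group by (category_id, asset_version, sequence_num) with differing child product ids.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class DuplicateChildCheck {

    // One row of dcs_cat_chldprd: the grouping key fields plus the child product id.
    record Row(String categoryId, int assetVersion, int seqNum, String childPrdId) {}

    // Mirrors the SQL: group by (category_id, asset_version, sequence_num) and
    // report groups that contain more than one distinct child product id.
    public static Set<String> duplicateKeys(List<Row> rows) {
        Map<String, Set<String>> children = new HashMap<>();
        for (Row r : rows) {
            String key = r.categoryId() + ":" + r.assetVersion() + ":" + r.seqNum();
            children.computeIfAbsent(key, k -> new HashSet<>()).add(r.childPrdId());
        }
        Set<String> duplicates = new TreeSet<>();
        for (Map.Entry<String, Set<String>> e : children.entrySet()) {
            if (e.getValue().size() > 1) {
                duplicates.add(e.getKey());
            }
        }
        return duplicates;
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row("cat1", 90, 0, "prodA"),
                new Row("cat1", 90, 0, "prodB"),
                new Row("cat1", 90, 1, "prodC"));
        System.out.println(duplicateKeys(rows));
    }
}
```

Such a standalone check is handy for validating a CSV export of the table before and after a cleanup.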








Monday, 14 January 2019

category corrupted

Category corruption solution

If we are not able to update a category in BCC because its child products are corrupted, we have to follow the steps below.


Suppose a particular category, say cat1111, is corrupted and we are unable to see its child products in BCC. We first have to verify whether this category is really corrupted.


Step 1: We need to get approval from the business team for this activity: approval for the deployment changes in BCC and for the updates made through Dyn Admin.

Once we get approval from the business, we raise a ticket with the environment support team's tool (e.g. SparkRed) to take a schema dump. If we modify something wrongly or something goes wrong, the dump lets us restore the old data in production and avoid data loss.

Step 2: The environment support team (SparkRed team) will export the data in CSV file format to the particular FTP location.

Now we have to take the BCC log and process it with a standalone program to extract the category id and asset version number. Then query the DB to return the category id and version.

Then we have to compare the standalone output and the DB output.


The DB query is as follows.

select CATEGORY_ID,ASSET_VERSION from pub.dcs_category where category_id in ('cat1001','cat1002','cat3004','cat3004' ) and is_head='1';


Here, we compare the DB record with the standalone record: if the DB asset version is greater than the standalone value, no action is needed and those categories work as expected. But whenever the DB value is less than or equal to the standalone value, we need to correct those categories.


Example:

Category cat10001 has asset version 90 in both the DB and the standalone output.
So, we need to correct this category.
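The comparison of the standalone output with the DB output can be sketched as a small standalone routine. The method and the map-based input shape are illustrative assumptions, not ATG APIs: a category needs correction when its DB asset version is less than or equal to the version seen in the BCC log.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class AssetVersionCheck {

    // Returns the category ids whose DB asset version is less than or equal
    // to the version extracted from the BCC log (the standalone output):
    // those are the categories that need correction.
    public static List<String> categoriesToCorrect(Map<String, Integer> dbVersions,
                                                   Map<String, Integer> logVersions) {
        List<String> corrupted = new ArrayList<>();
        for (Map.Entry<String, Integer> entry : logVersions.entrySet()) {
            Integer dbVersion = dbVersions.get(entry.getKey());
            if (dbVersion != null && dbVersion <= entry.getValue()) {
                corrupted.add(entry.getKey());
            }
        }
        return corrupted;
    }

    public static void main(String[] args) {
        Map<String, Integer> db = Map.of("cat10001", 90, "cat1002", 120);
        Map<String, Integer> log = Map.of("cat10001", 90, "cat1002", 95);
        // cat10001: DB 90 <= log 90, so it needs correction;
        // cat1002: DB 120 > log 95, so it is fine.
        System.out.println(categoriesToCorrect(db, log));
    }
}
```

In practice the dbVersions map would be filled from the dcs_category query above and logVersions from the parsed BCC log.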


Step 3: Meanwhile, keep the category's (cat10001) child products free of duplicates. If you find any duplicate child products in the category, remove them and keep the original data with the latest asset version.


To fetch the category with its child products we can use the query below:


select * from pub.dcs_cat_chldprd where category_id = 'cat10001' and asset_version = '90';

Now we have the child products for the particular category.


Step 4: Go to the Merch instance Dyn Admin, search for ProductCatalog-ver, and follow the steps below.

Sample print-items are provided below for reference.

This is an important step; we need to analyse it carefully and update step by step.

Printing this category (cat10001) in Dyn Admin with the latest asset version:

1.<add-item item-descriptor="category" id="cat10001:90">
               <set-property name="fixedChildProducts"><![CDATA[prod3456]]></set-property>
</add-item>
<print-item item-descriptor="category" id = "cat10001:90" />


2.<add-item item-descriptor="category" id="cat10001:90">
               <set-property name="fixedChildProducts">
                  <![CDATA[prod5677,prod878,prod3455

]]></set-property>
</add-item>
- updating existing definition of category:cat10001 :90



Currently the asset version is 90, which was corrupted. We now get the child products from asset version 90 and validate whether any duplicate data is present; if there are no duplicates we can keep the data.


<add-item item-descriptor="category" id="cat10001:90">
             <set-property name="fixedChildProducts"><![CDATA[]]></set-property>
</add-item>


Here, 90 is the latest asset version; it is created automatically once this category is added to a project in BCC via Add to Project.

Sometimes we are unable to add the particular category to a project; in that case we follow the step below.

We need to collect the categories with is_head = 1 and run the print-items below in Dyn Admin. Here, two categories have this issue.

<add-item item-descriptor="category" id="cat4567:56">
               <set-property name="fixedChildProducts"><![CDATA[]]></set-property>
</add-item>


<add-item item-descriptor="category" id="cat46777:67">
               <set-property name="fixedChildProducts"><![CDATA[]]></set-property>
</add-item>

Once we push these changes in Dyn Admin, we add the category using Add to Project; this creates a new asset version for the category.

Using the new asset version we can apply the Dyn Admin changes below.

Once the above step is completed in Dyn Admin, we make our normal changes, like the following:



<add-item item-descriptor="category" id="cat20046:314">
  <set-property name="fixedChildProducts"><![CDATA[prod345,prod567,prod567,..

]]></set-property></add-item>

Once the items are added to the category, we can deploy the BCC project.

Step 5: The child products are now loaded into this category. Make sure to validate that those products appear in BCC while previewing.


If not, delete this project, create a new project, select the category (cat100002), and Add to Project; the child products should then be visible in the BCC preview.



If everything looks good we can deploy the BCC project. Once the deployment is completed successfully, validate that the child products display properly for the particular category.


Then ask the environment support team (e.g. the SparkRed team) to rerun the earlier query and provide the export data file in CSV format; this confirms whether any categories are still corrupted after this activity.

If all looks good, we can inform the environment support team (e.g. the SparkRed team) that this BCC cleanup activity has been completed successfully.

Finally, the business team confirms that everything looks good.


Friday, 2 November 2018

ATG BCC cleaning orphan assets and cleaning Versioned data

ATG BCC cleaning  orphan assets and cleaning Versioned data 

As there were many orphan assets (products and SKUs) and unwanted versioned content (purging was not in place earlier), we got a task from the client to remove all the unwanted orphan SKUs/products.
We achieved this in the following way; we called it the BCC baseline.

1)We took CatA/CatB, identified all the orphan assets (SKUs and products), and removed them from CatA/CatB through DB queries.
2)Identify the GWP promotion SKUs and products for the current environment, to exclude them from the orphan product/SKU set.
3)Then create projects in BCC by adding all these orphan assets (SKUs and products, across multiple projects) and execute the projects (after the business team approves).


Now Export and Import DB.

While doing this, we have to follow the steps below carefully.
1)We need to communicate with the team regarding the freeze period for BCC.
2)Turn off the outbound feeds.
3)The business team should stop using BCC.
4)Export the BCC topology.
5)Shut down the BCC instance.
6)Back up the versionFileStore folder from the BCC instance and delete the original versionFileStore folder from the PROD BCC instance.
7)Export the data dump of the PUB schema and CATA schema of the PROD Oracle DB.
8)Import the data dump of the PUB and CATA schemas into the SIT 2 DB (PUB ==> BASELINE_EXP_BCC_PROD & CATA ==> BASELINE_EXP_CATA_PROD).
9)Create two empty schemas with schema metadata only [DDL only]: one for PUB as "BASELINE_IMP_BCC_PROD" and one for CATA as "BASELINE_IMP_CATA_PROD", in the same SIT 2 DB.
10)Export the data dump of only the price and inventory tables (dcs_plfol_chld, dcs_child_fol_pl, dcs_gen_fol_pl, dcs_price_level, dcs_price_levels, dcs_complex_price, dcs_price, dcs_price_list, dcs_inventory, dcs_inv_atp) from PROD core.
11)Import the data dump of those price and inventory tables into the "BASELINE_ATGCORE" schema of the SIT-2 DB.
12)Update the "DISTRIBUTED_LOCK_TIMEOUT" value to 3600 seconds in the SIT-2 DB and restart the SIT-2 DB instance so the updated value takes effect. This modification is required for the full deployment.
13)Audit the count of records for all user tables from the "BASELINE_EXP_BCC_PROD" and "BASELINE_EXP_CATA_PROD" schemas.
14)Create an XLS and store the record counts from both schemas in the expected format (we will see this format later).
15)Export the insert script for the das_id_generator table from the "BASELINE_EXP_BCC_PROD" schema.
16)Make sure all BCC instance tables exist in the "BASELINE_IMP_BCC_PROD" schema after the DBA creates the new schema with metadata only.
17)Make sure all BCC instance tables exist in the "BASELINE_IMP_CATA_PROD" schema after the DBA creates the new schema with metadata only.
18)Execute the grant-access SQL script in the "BASELINE_IMP_CATA_PROD" schema to grant DML access on the required tables to the "BASELINE_EXP_BCC_PROD" schema.
19)Execute the disable-constraints SQL script in the "BASELINE_IMP_CATA_PROD" schema.
20)Import the insert script for the das_id_generator table data into the "BASELINE_IMP_BCC_PROD" schema.


Execute the Export/Import Process
1)Configure the BASELINE_EXP_BCC_PROD  & BASELINE_EXP_CATA_PROD schema's in  Build Box using the "MonitoredDataSource" & "FakeXADataSource" classes in the "cata_export" & "pub_export" server local config layer
2)Configure the BASELINE_IMP_BCC_PROD  & BASELINE_IMP_CATA_PROD schema's in  Build Box using the "MonitoredDataSource" & "FakeXADataSource" classes in the  "pub_import" server local config layer
3)Configure the CORE schema of the SIT 2 DB in "cata_export",  "pub_export" & "pub_import" server local config layer
4)Execute the Repository Export process as per the Approach/Release document using "cata_export" & "pub_export" server local config layer
5)Verify the exported Files
6)Execute the Import Process for all the repositories as per the Approach/Release document using "pub_import" server local config layer
7)Execute the product catalog manual check-in process using the "ProductCatalog_Manually_checkin.sql" script
8)Audit the count of records for all user tables from the newly imported data in the "BASELINE_IMP_BCC_PROD" and "BASELINE_IMP_CATA_PROD" schemas
9)Store the new record counts from both schemas in the expected format and share with the client for review.
10)Create/update the datasource for "BASELINE_IMP_BCC_PROD" and "BASELINE_IMP_CATA_PROD" schema's in SIT weblogic console
11)Shut down the BCC instances of SIT2
12)Convert the existing BCC instance SwitchingDataSource_production and SwitchingDataSource_staging components from switching type to normal datasources by changing the class to atg.nucleus.JNDIReference and adding the JNDI name through the ATG-Data layer
13)Add the RL related hot fix in the BCC instance EAR and also add/update the RL related component changes in the BCC server local config layer
14)Start only the SIT BCC instance; it must point to the datasources of the "BASELINE_IMP_BCC_PROD" and "BASELINE_IMP_CATA_PROD" schemas
15)Execute the RL through SIT-BCC instance dyn/admin

16)Validate the newly generated versionFileStore location and publishingFileRepository
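The datasource configuration in steps 1-3 above follows the usual ATG pattern for running startSQLRepository outside an application server: a FakeXADataSource holding the JDBC details, wrapped by a MonitoredDataSource. A minimal sketch, assuming the standard component paths; the host, SID and password are placeholders:

```properties
# localconfig/atg/dynamo/service/jdbc/FakeXADataSource.properties (e.g. cata_export layer)
$class=atg.service.jdbc.FakeXADataSource
driver=oracle.jdbc.OracleDriver
URL=jdbc:oracle:thin:@<db-host>:1521:<SID>
user=BASELINE_EXP_CATA_PROD
password=<password>

# localconfig/atg/dynamo/service/jdbc/JTDataSource.properties
$class=atg.service.jdbc.MonitoredDataSource
dataSource=/atg/dynamo/service/jdbc/FakeXADataSource
```

One FakeXADataSource per schema is configured this way in each server's local config layer.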


Verify the DB Import in SIT 2

1)Shut down the instances of SIT2(eStore,BCC,Aux)
2)Convert the existing e-Store instance SwitchingDataSource component from switching type to a normal datasource by changing the class name and adding the JNDI name through the ATG-Data layer. It will point to "BASELINE_IMP_CATA_PROD"
3)Change the password of the datasource, if applicable
4)Make sure the weblogic datasources of SIT 2 BCC instance and e-store instance point to new imported schemas.
5)Bring Up the BCC server
6)Verify the BCC logs
7)Bring up the store and Aux servers
8)Verify the Store Aux Logs
9)Verify the eStore Instance
10)Configure the BCC agents
11)Perform basic validation of the BCC deployment and also publish an empty project
12)Perform BCC Full Deployment
13)Verify the BCC publishing of catalog, content, etc. repository assets after modification
14)Verify the BCC deployment of config file system assets such as targeters, scenarios, user segments, etc. by either creating a new asset or updating an existing asset
15)Execute the enable constraints SQL script in "BASELINE_IMP_CATA_PROD" schema
16)Execute CMS in one of the agent
17)Perform Endeca Indexing in SIT-2 environment
18)Verify the eStore instance, mainly the browse and shop pages
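The switching-to-normal datasource conversion in step 2 above can be sketched as a local config override in the ATG-Data layer, assuming the standard JNDIName property of atg.nucleus.JNDIReference; the JNDI name is a placeholder for whatever name the WebLogic datasource is registered under:

```properties
# ATG-Data config layer override for the SwitchingDataSource component
$class=atg.nucleus.JNDIReference
JNDIName=<jndi-name-of-BASELINE_IMP_CATA_PROD-datasource>
```

Reverting later just means removing the override so the original switching component class takes effect again.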


Production CutOver
1)Copy the versionFile store to a predefined location
2)Export the data dump from the newly imported "BASELINE_IMP_BCC_PROD" and "BASELINE_IMP_CATA_PROD" schemas from SIT2
3)Import the exported data dump from the "BASELINE_IMP_BCC_PROD" and "BASELINE_IMP_CATA_PROD" schemas of SIT2 into the QA DB as PUB2, CATA2, CATB2, PRVCATA2, PRVCATB2
4)Delete the data from the DMS-related tables, E-PUB-related tables, and DSS server ID tables
5)Bring down all the instances of PROD
6)Make the weblogic data source changes to point to the new schemas as above
7)Bring up the store, aux instances, etc.
8)Bring up the BCC instances
9)Verify the logs
10)Import the BCC Topology
11)Perform Sample Deployment
12)Perform Stibo Deployment
13)Perform Indexing
14)Verify the Functionality/site
15)User testing


Export scripts/commands (from DB  to XML file)


The export scripts are executed via the repository; each generates an XML file and a log (txt) file.
As part of the export commands, we also have to provide the output file and log file names.

At each export command level, a count query against the primary table is given to validate the data itself.
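All the export commands below share one shape. A sketch of that pattern as a small helper, which only echoes the assembled command (rather than executing it) so the module, server and repository values can be checked first:

```shell
# Build and echo the common export command shape:
# nohup startSQLRepository -m <module> -s <server> -encoding ISO-8859-1 \
#   -repository <repo> -noTransaction -export all exported_<name>_dump.xml \
#   > exported_<name>_dump_log.txt &
run_export() {
  local module="$1" server="$2" repo="$3" name="$4"
  echo "nohup startSQLRepository -m $module -s $server -encoding ISO-8859-1" \
       "-repository $repo -noTransaction -export all exported_${name}_dump.xml" \
       "> exported_${name}_dump_log.txt &"
}

# Example: reproduce the LocationRepository export command below.
run_export modules.Commerce cata_export /atg/commerce/locations/LocationRepository location
```

The concrete commands that follow are instances of this pattern, with the module switching between modules.Commerce, DPS, DPS.InternalUsers and modules.CA depending on the repository.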

CAT-A (10):
-----------
1) /atg/multisite/SiteRepository
2) /atg/seo/SEORepository
3) /atg/userprofiling/PersonalizationRepository
4) /atg/commerce/catalog/ProductCatalog
5) /atg/commerce/claimable/ClaimableRepository
6) /atg/commerce/locations/LocationRepository
7) /com/xx/giftregistry/repo/GiftRegistryRepository
8) /com/xx/newsLetter/NewsSubscriptionRepository
9) /com/xx/repo/BrandNumbersRepository
10) /com/xx/common/content/statictext/repo/ContentRepository



From PUB (6):
-------------
1) /atg/systemconfiguration/SystemConfigurationRepository
2) /atg/dynamo/security/AdminSqlRepository
3) /atg/portal/framework/PortalRepository
4) /atg/userprofiling/InternalProfileRepository
5) /atg/web/viewmapping/ViewMappingRepository
6) /atg/epub/file/PublishingFileRepository

LocationRepository :
--------------------
nohup startSQLRepository -m modules.Commerce -s  cata_export -encoding ISO-8859-1 -repository /atg/commerce/locations/LocationRepository -noTransaction -export all exported_location_dump.xml > exported_location_dump_log.txt &
Primary Table :
select count(*) from dcs_location

SEORepository :
---------------
nohup startSQLRepository -m modules.Commerce -s cata_export -encoding ISO-8859-1 -repository /atg/seo/SEORepository -noTransaction -export all exported_SEO_dump.xml > exported_SEO_dump_log.txt &
Primary Table :
select count(*) from das_seo_tag


ClaimableRepository:
---------------------
nohup startSQLRepository  -m modules.Commerce  -s  cata_export -encoding ISO-8859-1 -repository /atg/commerce/claimable/ClaimableRepository -noTransaction -export all exported_claimable_dump.xml  > exported_claimable_dump_log.txt &
Note :
------
Need to override ClaimableRepository.properties to point to the SwitchingDataSource.
Primary Table :
select count(*) from dcspp_claimable
select count(*) from dcspp_cp_folder

ProductCatalog :
----------------
nohup startSQLRepository  -m modules.Commerce -s cata_export -encoding ISO-8859-1 -repository /atg/commerce/catalog/ProductCatalog -noTransaction -export all exported_productCatalog_dump.xml > exported_productCatalog_dump_log.txt &
Duration : 13 hours 15 mins
File Size : 7136205408 bytes (~7.14 GB)


GiftRegistryRepository :
------------------------
nohup startSQLRepository  -m modules.Commerce -s  cata_export -encoding ISO-8859-1 -repository /com/xx/giftregistry/repo/GiftRegistryRepository  -noTransaction -export all exported_giftregistry_dump.xml > exported_giftregistry_dump_log.txt &
Primary Table :
select count(*) from xx_event

NewsSubscriptionRepository :
-----------------------------
nohup startSQLRepository  -m modules.Commerce -s  cata_export -encoding ISO-8859-1 -repository /com/xx/newsLetter/NewsSubscriptionRepository -noTransaction -export all exported_newsSubscription_dump.xml > exported_newsSubscription_dump_log.txt &
Primary Table :
select count(*) from xx_NEWS_LETTER

BrandNumbersRepository :
-------------------------
nohup startSQLRepository  -m modules.Commerce -s  cata_export -encoding ISO-8859-1 -repository /com/xx/repo/BrandNumbersRepository -noTransaction -export all exported_brandnumbers_dump.xml > exported_brandnumbers_dump_log.txt &
Primary Table :
select count(*) from xx_BRAND_NUMBERS

ContentRepository :
--------------------
nohup startSQLRepository  -m modules.Commerce -s  cata_export -encoding ISO-8859-1 -repository /com/xx/common/content/statictext/repo/ContentRepository -noTransaction -export all exported_content_dump.xml > exported_content_dump_log.txt &
Primary Table :
select count(*) from xx_static_text            -- 4964
select count(*) from xx_page                   -- 258
select count(*) from xx_site_big_static_text   -- 611
select count(*) from xx_site_label_static_text -- 16896
select count(*) from xx_site_msg_static_text   -- 2044

PersonalizationRepository :
---------------------------
nohup startSQLRepository  -m DPS  -s  cata_export -encoding  ISO-8859-1 -repository /atg/userprofiling/PersonalizationRepository -noTransaction -export all exported_personalization_dump.xml  > exported_personalization_dump_log.txt &
Note :
------
Need to override PersonalizationRepository.properties to point to the SwitchingDataSource.
Primary Table :
select count(*) from dps_seg_list_folder
select count(*) from dps_seg_list

SiteRepository :
----------------
nohup startSQLRepository  -m modules.Commerce -s cata_export -encoding  ISO-8859-1 -repository /atg/multisite/SiteRepository/ -noTransaction -export all exported_site_dump.xml  > exported_site_dump_log.txt &
Primary Table :
select count(*) from site_configuration
select count(*) from site_group

SystemConfigurationRepository :
-------------------------------
nohup startSQLRepository  -m modules.CA -s  pub_export -encoding ISO-8859-1 -repository /atg/systemconfiguration/SystemConfigurationRepository  -noTransaction -export all exported_sys_conf_dump.xml > exported_sys_conf_dump_log.txt &
Primary Table :
select count(*) from das_sys_config

AdminSqlRepository :
--------------------
nohup startSQLRepository  -m modules.CA -s  pub_export -encoding ISO-8859-1 -repository /atg/dynamo/security/AdminSqlRepository  -noTransaction -export all exported_admin_sql_dump.xml > exported_admin_sql_dump_log.txt &
Primary Table :
select count(*) from das_account

PortalRepository :
-------------------
nohup startSQLRepository  -m modules.CA -s  pub_export -encoding ISO-8859-1 -repository /atg/portal/framework/PortalRepository  -noTransaction -export all exported_portal_dump.xml > exported_portal_dump_log.txt &
Note :
Need to modify the MANIFEST.MF file under the CA module :
DafEar.Admin DCS-UI.Versioned BIZUI PubPortlet DCS-UI.SiteAdmin.Versioned SiteAdmin.Versioned DCS.Versioned DCS-UI DAF.Endeca.Index.Versioned DCS.Endeca.Index.Versioned DCS.Endeca.Index.SKUIndexing modules.PARALCommonModule modules.PARALModule modules.PARALMerchModule modules.Feed.Import  modules.Feed.Export modules.Commerce
Primary Table :
select count(*) from paf_folder
select count(*) from paf_device_outputs
select count(*) from paf_display_modes
select count(*) from paf_community
select count(*) from paf_template
select count(*) from paf_page_template
select count(*) from paf_page
select count(*) from paf_layout
select count(*) from paf_style
select count(*) from paf_col_palette
select count(*) from paf_title_template
select count(*) from paf_region_def
select count(*) from paf_region
select count(*) from paf_gear_modes
select count(*) from paf_gear_def
select count(*) from paf_gear
select count(*) from paf_gear_param
select count(*) from paf_base_comm_role
select count(*) from paf_comm_template
select count(*) from paf_ct_folder
select count(*) from paf_ct_page
select count(*) from paf_ct_region
select count(*) from paf_ct_gear
select count(*) from paf_ct_alt_gear

InternalProfileRepository :
----------------------------
nohup startSQLRepository  -m DPS.InternalUsers -s  pub_export -encoding  ISO-8859-1 -repository /atg/userprofiling/InternalProfileRepository -noTransaction -export all exported_internal_profile_dump.xml > exported_internal_profile_dump_log.txt &
Note :
Not exported Items:
-------------------
workbench
workbenchTile
workbenchReport
workbenchReportItem

Primary Table :
select count(*) from dpi_user
select count(*) from dpi_contact_info
select count(*) from dpi_mailing
select count(*) from dpi_mail_batch
select count(*) from dpi_mail_server
select count(*) from dpi_role
select count(*) from dpi_access_right
select count(*) from dpi_organization
select count(*) from dpi_folder
select count(*) from wb_workbench
select count(*) from wb_tile
select count(*) from wb_report
select count(*) from wb_report_item
select count(*) from wb_report_msg
select count(*) from dsi_profile_slot
select count(*) from dsi_ind_scenario
select count(*) from dsi_coll_scenario
select count(*) from dsi_scenario_info
select count(*) from dsi_scen_mig_info
select count(*) from dsi_template_info
select count(*) from dsi_coll_trans
select count(*) from dsi_ind_trans
select count(*) from dsi_deletion
select count(*) from dsi_migration
select count(*) from dsi_xref

ViewMappingRepository :
------------------------
nohup startSQLRepository  -m modules.CA -s  pub_export -encoding ISO-8859-1 -repository /atg/web/viewmapping/ViewMappingRepository -noTransaction -export all exported_viewMapping_dump.xml > exported_viewMapping_dump_log.txt &
Primary Table:
select count(*) from vmap_mode
select count(*) from vmap_fh
select count(*) from vmap_im
select count(*) from vmap_ivm
select count(*) from vmap_pvm
select count(*) from vmap_iv
select count(*) from vmap_pv
select count(*) from vmap_ivattrdef
select count(*) from vmap_pvattrdef
select count(*) from vmap_attrval


Import scripts/commands

The following repositories are to be imported; they were exported to XML files using the scripts above.

CAT-A (10):
-----------
1) /atg/multisite/SiteRepository  (imported as part of the product catalog import)
2) /atg/seo/SEORepository
3) /atg/userprofiling/PersonalizationRepository
4) /atg/commerce/catalog/ProductCatalog
5) /atg/commerce/claimable/ClaimableRepository
6) /atg/commerce/locations/LocationRepository
7) /com/xx/giftregistry/repo/GiftRegistryRepository
8) /com/xx/newsLetter/NewsSubscriptionRepository
9) /com/xx/repo/BrandNumbersRepository
10) /com/xx/common/content/statictext/repo/ContentRepository


From PUB (6):
-------------
1) /atg/systemconfiguration/SystemConfigurationRepository
2) /atg/dynamo/security/AdminSqlRepository
3) /atg/portal/framework/PortalRepository
4) /atg/userprofiling/InternalProfileRepository
5) /atg/web/viewmapping/ViewMappingRepository
6) /atg/epub/file/PublishingFileRepository
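Two command forms appear in the import runs below: non-versioned repositories are loaded with startSQLRepository -import, while versioned repositories go through startSQLImport -file with a workspace (and optionally -nocheckin to check assets in later). A sketch of both shapes; the repository, dump and workspace names are placeholders, and the commands are echoed rather than executed:

```shell
# Non-versioned repository: plain repository import.
nonversioned="startSQLRepository -m modules.CA -s pub_import -repository <repo> -import <dump>.xml"

# Versioned repository: import into a named workspace; -nocheckin defers check-in.
versioned="startSQLImport -m modules.CA -s pub_import -repository <repo> -file <dump>.xml -workspace <workspace-name> -nocheckin"

echo "$nonversioned"
echo "$versioned"
```

Matching the right form to each repository matters: importing a versioned repository without a workspace, or a non-versioned one through startSQLImport, will fail.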


Import Commands To PUB Schema :
----------------------------------

InternalProfileRepository :
---------------------------
nohup startSQLRepository -m DPS.InternalUsers -s  pub_import  -repository /atg/userprofiling/InternalProfileRepository -import  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/PUB/exported_internal_profile_dump.xml   > imported_internal_profile_log.txt &

ProductCatalog :
------------------
nohup startSQLImport -m modules.CA -s  pub_import  -repository  /atg/commerce/catalog/ProductCatalog -file  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/CAT-A/exported_productCatalog_dump.xml  -workspace ProductCatalogImport -nocheckin > imported_productCatalog_log.txt &
[info@2020-12-03 04:28:27.608] created new workspace: workspace:75800

[info@2020-12-03 04:28:27.614] parsing input data

CONTAINER:atg.adapter.gsa.sqlimport.ImportException; SOURCE:com.ctc.wstx.exc.WstxUnexpectedCharException: Illegal character ((CTRL-CHAR, code 26))
 at [row,col {unknown-source}]: [28036853,45]
        at atg.adapter.gsa.sqlimport.ImportParserImpl.parseFirsPass(ImportParserImpl.java:130)
        at atg.adapter.gsa.sqlimport.SQLImporter.doImport(SQLImporter.java:1992)
        at atg.adapter.gsa.sqlimport.SQLImporter.execute(SQLImporter.java:1353)
        at atg.adapter.gsa.sqlimport.SQLImporter.main(SQLImporter.java:1306)
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Illegal character ((CTRL-CHAR, code 26))
 at [row,col {unknown-source}]: [28036853,45]
        at com.ctc.wstx.sr.StreamScanner.throwInvalidSpace(StreamScanner.java:666)
        at com.ctc.wstx.sr.StreamScanner.throwInvalidSpace(StreamScanner.java:651)
        at com.ctc.wstx.sr.BasicStreamReader.readCDataPrimary(BasicStreamReader.java:4226)
        at com.ctc.wstx.sr.BasicStreamReader.nextFromTreeCommentOrCData(BasicStreamReader.java:3285)
        at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2801)
        at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1065)
        at org.codehaus.stax2.ri.Stax2EventReaderImpl.nextEvent(Stax2EventReaderImpl.java:255)
        at atg.adapter.gsa.sqlimport.ImportParserImpl.parseFirsPass(ImportParserImpl.java:112)
        ... 3 more
Solution :
Clean the illegal character (CTRL-CHAR, code 26) out of the export XML with sed before re-running the import (generic substitution syntax shown):
sed -ie 's/few/asd/g' hello.txt
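For this particular failure the character to remove is the SUB control character the parser reported (CTRL-CHAR code 26, i.e. byte 0x1A). A sketch using tr on a made-up sample file; the real file was the ProductCatalog dump, regenerated with a _withRemovedChars suffix as in the re-run command below:

```shell
# Simulate a dump containing the illegal 0x1A byte, then strip it.
printf 'good\032text\n' > sample_dump.xml
tr -d '\032' < sample_dump.xml > sample_dump_withRemovedChars.xml
cat sample_dump_withRemovedChars.xml   # -> goodtext
```

tr -d deletes every occurrence of the byte, so the cleaned file parses where the original hit WstxUnexpectedCharException.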

Re-run the command below after increasing maxConnection on the JTData component at the server config layer.

Also add the following entry to the modules.CA manifest file:
ATG-Required: DafEar.Admin DCS-UI.Versioned BIZUI PubPortlet DCS-UI.SiteAdmin.Versioned SiteAdmin.Versioned DCS.Versioned DCS-UI DAF.Endeca.Index.Versioned DCS.Endeca.Index.Versioned DCS.Endeca.Index.SKUIndexing modules.PARALCommonModule modules.PARALModule modules.PARALMerchModule modules.Feed.Import  modules.Feed.Export modules.Commerce


nohup startSQLImport -m modules.CA -s pub_import -repository /atg/commerce/catalog/ProductCatalog -file /opt/ATG/ATG10.2/home/bin/exported_productCatalog_dump_22_1_2017_withRemovedChars.xml -workspace ProductCatalogImport -nocheckin -batchSize 5000 > /opt/ATG/ATG10.2/home/bin/imported_productCatalog_log_siva1.txt &

On the 163 box:

nohup startSQLImport -m modules.CA -s pub_import -repository /atg/commerce/catalog/ProductCatalog -file /opt/ATG/ATG10.2/home/bin/Apparao/exported_productCatalog_dump_22_1_2017_withRemovedChars.xml -workspace ProductCatalogImport1 -nocheckin -batchSize 5000 > /opt/ATG/ATG10.2/home/bin/imported_productCatalog_log_apparao_24_1.txt &


workspace: 76800

LocationRepository :
---------------------
./startSQLImport -m modules.CA -s pub_import -repository /atg/commerce/locations/LocationRepository -file exported_location_dump.xml -workspace location_import_workspace
**** Error      Thu Dec 03 04:23:13 IST 2020    1606949593882   /       Unable to set configured property "/com/xx/feed/export/listener/IncrementalPriceItemIndexingListener.incrementalQueueRepository" atg.nucleus.ConfigurationException: Unable to resolve component /atg/search/repository/IncrementalItemQueueRepository
**** Error      Thu Dec 03 04:23:13 IST 2020    1606949593915   /       Unable to set configured property "/atg/search/SynchronizationInvoker.port" atg.nucleus.ConfigurationException: Unable to resolve component /atg/search/Constants.null
**** Error      Thu Dec 03 04:23:13 IST 2020    1606949593916   /       Unable to set configured property "/atg/search/SynchronizationInvoker.host" atg.nucleus.ConfigurationException: Unable to resolve component /atg/search/Constants.null
**** info       Thu Dec 03 04:23:13 IST 2020    1606949593967   /atg/epub/deployment/DeploymentManager  Resolving reference to /atg/deployment/DeploymentManager
[info@2020-12-03 04:23:14.089] created new workspace: workspace:75701

[info@2020-12-03 04:23:14.096] parsing input data

[info@2020-12-03 04:23:14.569] importing 184 items.

[info@2020-12-03 04:23:14.569] checking out versioned items in batch

[info@2020-12-03 04:23:15.599] performing add/update/removes

[info@2020-12-03 04:23:16.825] Phase 2 of 2: 100%

[info@2020-12-03 04:23:16.826] checkIn: workspace:75701

java.lang.AbstractMethodError: atg.adapter.version.VersionRepository.getVersionItemsInLine(Latg/versionmanager/DevelopmentLine;ZZZ)Ljava/util/Iterator;
        at atg.adapter.secure.GenericSecuredMutableVersionRepository.getVersionItemsInLine(GenericSecuredMutableVersionRepository.java:883)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.get(DevelopmentLineRepositoryImpl.java:547)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersions(DevelopmentLineRepositoryImpl.java:376)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersions(DevelopmentLineRepositoryImpl.java:393)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersions(DevelopmentLineRepositoryImpl.java:351)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersionsCount(DevelopmentLineRepositoryImpl.java:404)
        at atg.versionmanager.impl.WorkspaceRepositoryImpl.checkInAll(WorkspaceRepositoryImpl.java:1490)
        at atg.adapter.gsa.sqlimport.SQLImporter.checkIn(SQLImporter.java:2512)
        at atg.adapter.gsa.sqlimport.SQLImporter.doImport(SQLImporter.java:2043)
        at atg.adapter.gsa.sqlimport.SQLImporter.execute(SQLImporter.java:1353)
        at atg.adapter.gsa.sqlimport.SQLImporter.main(SQLImporter.java:1306)



Solution: Ran again on the 163 box; no issue.

Import of media-external items only :
nohup startSQLImport -m modules.CA -s  pub_import  -repository  /atg/commerce/catalog/ProductCatalog -file  /opt/ATG/ATG10.2/home/bin/exported_media_external_dump.xml  -workspace Only_MediaItem_Import -nocheckin > Only_MediaItem_Import.txt &

nohup startSQLImport -m modules.CA -s  pub_import  -repository  /atg/commerce/catalog/ProductCatalog -file  /opt/ATG/ATG10.2/home/bin/exported_media1_external_dump.xml  -workspace exported_media1_external_dump -nocheckin > exported_media1_external_dump.txt &

SEORepository :
---------------
nohup startSQLImport -m modules.CA -s pub_import -repository  /atg/seo/SEORepository -file  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/CAT-A/exported_SEO_dump.xml  -workspace SEO_Import > imported_SEO_log.txt &

log:

[info@2020-12-05 04:46:48.034] parsing input data

[info@2020-12-05 04:46:48.323] importing 239 items.

[info@2020-12-05 04:46:48.323] checking out versioned items in batch

[info@2020-12-05 04:46:49.338] performing add/update/removes

[info@2020-12-05 04:46:50.472] Phase 2 of 2: 100%

[info@2020-12-05 04:46:50.472] checkIn: workspace:76100

java.lang.AbstractMethodError: atg.adapter.version.VersionRepository.getVersionItemsInLine(Latg/versionmanager/DevelopmentLine;ZZZ)Ljava/util/Iterator;
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.get(DevelopmentLineRepositoryImpl.java:547)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersions(DevelopmentLineRepositoryImpl.java:376)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersions(DevelopmentLineRepositoryImpl.java:393)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersions(DevelopmentLineRepositoryImpl.java:351)
        at atg.versionmanager.impl.DevelopmentLineRepositoryImpl.getAllAssetVersionsCount(DevelopmentLineRepositoryImpl.java:404)
        at atg.versionmanager.impl.WorkspaceRepositoryImpl.checkInAll(WorkspaceRepositoryImpl.java:1490)
        at atg.adapter.gsa.sqlimport.SQLImporter.checkIn(SQLImporter.java:2512)
        at atg.adapter.gsa.sqlimport.SQLImporter.doImport(SQLImporter.java:2043)
        at atg.adapter.gsa.sqlimport.SQLImporter.execute(SQLImporter.java:1353)
        at atg.adapter.gsa.sqlimport.SQLImporter.main(SQLImporter.java:1306)


Solution: Ran again on the 163 box; assets are checked in.

SystemConfigurationRepository :
-------------------------------
nohup startSQLRepository -m modules.CA -s  pub_import  -repository /atg/systemconfiguration/SystemConfigurationRepository -import  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/PUB/exported_sys_conf_dump.xml   > imported_system_conf_log.txt &




AdminSqlRepository :
--------------------
nohup startSQLRepository -m modules.CA -s  pub_import  -repository /atg/dynamo/security/AdminSqlRepository -import  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/PUB/exported_admin_sql_dump.xml   > imported_AdminSql_log.txt &


PortalRepository :
------------------
nohup startSQLRepository -m modules.CA -s  pub_import  -repository /atg/portal/framework/PortalRepository -import  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/PUB/exported_portal_dump.xml   > imported_portal_log.txt &



PersonalizationRepository (versioned repository; executed first, with no check-in) :
---------------------------
nohup startSQLImport -m modules.CA -s pub_import -repository /atg/userprofiling/PersonalizationRepository -file  /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/CAT-A/exported_personalization_dump.xml  -workspace Personalization_Import   > imported_personalization_log.txt &


ViewMappingRepository :
-----------------------
nohup startSQLRepository -m modules.CA -s  pub_import   -repository   /atg/web/viewmapping/ViewMappingRepository -import /opt/ATG/ATG10.2/home/bin/Exported_Files_On_1_20_2017/CorrectFiles/PUB/exported_viewMapping_dump.xml   > imported_viewMapping_log.txt &




ClaimableRepository :
---------------------
nohup startSQLImport -m modules.CA -s  pub_import  -repository    /atg/commerce/claimable/ClaimableRepository -file  /opt/ATG/ATG10.2/home/bin/Apparao/exported_claimable_dump.xml  -workspace Claimable_import > imported_Claimable_log_apparao.txt &


LocationRepository : again run at the 163 box
---------------------
nohup startSQLImport -m modules.CA -s pub_import -repository /atg/commerce/locations/LocationRepository -file /opt/ATG/ATG10.2/home/bin/Apparao/exported_location_dump.xml -workspace location_import_workspace > imported_location_log_apparao.txt &


GiftRegistryRepository :
------------------------
nohup startSQLImport -m modules.CA -s  pub_import  -repository    /com/xx/giftregistry/repo/GiftRegistryRepository  -file  /opt/ATG/ATG10.2/home/bin/Apparao/exported_giftregistry_dump.xml  -workspace GiftRegistry_import > imported_giftRegistry_log_apparao.txt &


NewsSubscriptionRepository :
-----------------------------
nohup startSQLImport -m modules.CA -s  pub_import  -repository     /com/xx/newsLetter/NewsSubscriptionRepository -file  /opt/ATG/ATG10.2/home/bin/Apparao/exported_newsSubscription_dump.xml  -workspace NewsSubscription_import  > imported_newsSubscription_log_apparao.txt &

BrandNumbersRepository :
------------------------
nohup startSQLImport -m modules.CA -s   pub_import -repository    /com/xx/repo/BrandNumbersRepository -file  /opt/ATG/ATG10.2/home/bin/Apparao/exported_brandnumbers_dump.xml  -workspace BrandNumbers_imported  > imported_brandnumber_log_apparao.txt &

ContentRepository :
-------------------
nohup startSQLImport -m modules.CA -s  pub_import  -repository  /com/xx/common/content/statictext/repo/ContentRepository -file  /opt/ATG/ATG10.2/home/bin/Apparao/exported_content_dump.xml  -workspace Content_imported > imported_content_log_apparao.txt &