Channel: SCN: Message List

Re: send report to user email based on table


But this would require them to use the CMC? Actually, this would be helpful for me.

 

I may be doing this wrong, but whenever they add someone, I have to pause the report and add the user. I get it wrong, so I end up making a brand-new schedule.


Re: EMIGALL object for updating not creating object instances


Thank you very much, Daniel. I understand what you are saying about migration object CONNOBJ. For a migration object like INST_MGMT, where the documentation does not clearly specify which object to use to modify data, what can I do? Writing a BDC from LSMW would be easy, but my boss wanted to use EMIGALL for everything.

Re: send report to user email based on table


This should be no issue to achieve.

The whole BW flow is purely optional; you can simply upload/maintain an XLS on your BI system and use that as the source for your publication Webi.

Re: Use of Proforma invoice in SAP GTS


Ravi,

 

It does not have to be only a proforma invoice that creates the export declaration; it depends on your business scenario.

 

You can clear export customs using a proforma invoice; however, a commercial invoice is a must when clearing the goods on the receiving side.

 

If you can generate the commercial invoice by the time you file the export declaration, you can absolutely use it. However, please check whether that can happen in all cases; otherwise you have to fall back on the proforma invoice.

Regarding the configuration:

ECC:

Check SPRO - SD - Foreign Trade/Customs - SAP GTS Plug-In - Control Data Transfer - Configure Document Types, and check SD0C.

Here you can see which billing document types are configured.

Once you find them, check the feeder-system document type mapping in GTS.

 

Please do let me know if you can't get there. Thanks.

Regards

Raj

Re: DYNP_VALUES_READ & DYNP_VALUES_UPDATE at Selection-Screen Output


Hi Kurt,

 

I am wondering about the same thing.

Cheers, Otto

Re: Sending emails via SAP


OK, then try the send again, but deactivate the checkbox I mentioned... You will be able to see the sending process in the message tray. In the Outbox folder tab you will see the message in the process of being sent; whether it succeeded or failed, you will see that in the final tab.

Re: Dashboards: Issue with BEx Authorization Variable


Cristian,

 

Yes, I have created a duplicate of the authorization variable, removed the input values, and used the duplicate in my queries. Hope this helps.

 

Thanks

 

Abhi

Re: Regarding output (Print preview)


Moved from SAP ERP Sales and Distribution (SAP SD) to Output Management. Have you checked the Determination Analysis? If so, share it here.

 

 

G. Lakshmipathi


Re: False Error message in CUP "User search failed for given criteria. Try again"


I found the solution. The connector was missing for one of the systems, so the message was coming back from the system that GRC was not able to reach; the user was found in the remaining systems.

 

We added the new connector and the issue was resolved.

Re: Set Filter + isResultSetEmpty On Startup?


Tammy: negate the check with the ! operator:

!DS_1.isResultSetEmpty()
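
For context, a minimal On Startup sketch combining the two pieces (DS_1 is the data source alias from the thread; the dimension name and filter value below are placeholders):

  DS_1.setFilter("0CALYEAR", "2016");
  if (DS_1.isResultSetEmpty()) {
    APPLICATION.alert("No data for the startup filter.");
  }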

Re: SAP BI 4.1 Compatibilty with BW 7.5


Hi Henry

 

Do we know whether BI 4.1 SP6 will support BW 7.5 (SP1/SP2/SP3), or will we require an upgrade to a later BI 4.1 SPx or to 4.2 SPx?

 

I presume that 7.5 is a significant change and that the version of, say, BICS will be different from that on BW 7.4, hence why the PAM shows no support at present?

 

Thanks Henry

 

Will

Re: Error on concatenating strings in calculated columns while creating Analytic Views


Hi Nithin

Thanks for your help, it worked. One more thing: it would be best if I could get the second concatenated column on a new line, for example:

 

Name: Rahul

Surname: Khanna

 

Regards

Rahul

Re: Frequency of record count updates in DataBridge 10 vs DataBridge 7.5


Hi Steve

 

I would assume you'd decided DataBridge was the optimal solution for the client's data-feed import into PCM. I suppose SAP would say that since they basically increased the largest theoretical OLAP cube possible in PCM, they had to rationalise the updates, which slow the DataBridge routine down.

 

OK, so the distribution of the source data volumes has a high standard deviation and any expected volume figure is meaningless. In that case, have they considered adding a simple layer of logic that creates source data files of 10,000 records each, with any left-over records in a final load file?

 

Since we're talking statistical distributions: for load trials of the same average data quality, the probability of a failure somewhere in the load grows with the record count N (if each record independently fails with probability p, the chance of at least one failure in N records is 1 - (1 - p)^N, which approaches 1 as N becomes very large). In other words, your client should be more reassured when N is at the small end of that distribution of data-set record counts, regardless of anything else.

 

This is simple ETL if you have a good ETL solution, and even if you don't, a simple Windows text-splitter utility run automatically against the source data file will do it for you. A simple bit of recursive Windows batch script can re-run the DataBridge import per file, and it can all be neatly packaged in a console routine set to send an alert after each file loads successfully.

 

Data Loader is already optimised for this sort of recursive loading, but since you didn't mention it in your last reply I guess it is off the table? Anyway, it's an option, it handles high record volumes well, and you can make it as interactive as you like, with feedback such as the number of records available to load and information on records loaded; it even tells you what kind of load error has occurred before you load into the PP tables (I said this before). Otherwise it will load them into the PP tables in one step, like DataBridge does.

 

Could I be so bold as to say the real requirement here might be pre-validation of the records before the load completes, to prevent or minimise load failure? Reassurance that the last N records have been loaded won't stop the error if there is going to be one in record N+1.

 

Regardless of whether reassurance or pre-validation of the data is the requirement, it has a cost. The question is: are they willing to pay?

 

The subtle point here is that reassurance can come after the fact, in the shape of a DataBridge trace log giving the number of successfully imported records, which the client can presumably tie back to the source-data record count. If the client wants that feedback during the process, they are adding a constraint that doesn't improve the process, and Data Loader is the way to go, since it is more geared towards giving feedback than DataBridge is.

 

Just out of interest, what is the performance when your client loads 1 million records using DataBridge?

Regards

 

Michael

GRC10 - Rule Regenerations can change Rule ID (which is no longer assigned to an existing Mitigating Control)


Our application has GRC 10 SP21 installed. We have consistently had problems with mitigating controls (which we maintain in Production) due to rule generation. When updated functions and risks are transported from Dev > QA > Prod, the rules have to be generated in each instance. Every time a regeneration is executed, the Rule IDs can and often do change. The new Rule ID is not connected to the mitigating control, so I have to add the new Rule ID and the role to the existing MC. I have had to do this countless times, and it is extremely time-consuming.

Has anyone encountered this situation before, and do you know whether there is a fix for this?

 

I'd appreciate any suggestions you could provide; thank you very much.

Re: Reading Workbooks on Bex Analyser


Hi Pha,

 

There are two ways to save a workbook:

  • One is to save it in Favorites, where it can be opened only by THAT user.
  • The other is based on folders/roles; you then assign that role to a set of users. This is global.

So if you want your colleague to see it, save it under a new folder/role and assign it to him.

 

Regards

 

SVU


Re: SAP Anywhere

Re: SAP IQ - Day of the Week Function Issue


Hi Ade,

 

1 - For a function to be executed by IQ, you have to add "FROM iq_dummy"; otherwise it is executed by SQL Anywhere.

 

2 - The IQ option is Date_First_Day_Of_Week, NOT First_Day_Of_Week:

 

set temporary option Date_First_Day_Of_Week=1;

select datepart(dw, getdate()) from iq_dummy;
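
To see point 1 in practice, you can run the same expression both ways (a sketch; the day number returned depends on the option settings in effect):

select datepart(dw, getdate()) from iq_dummy;  -- executed by IQ, honours Date_First_Day_Of_Week
select datepart(dw, getdate());                -- without FROM iq_dummy, handled by SQL Anywhere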

 

As per the IQ manuals, SAP IQ does not have the same constants or data-type promotions as SAP SQL Anywhere, with which it shares a common user interface.

 

Please see also KBA 2221708:

https://service.sap.com/sap/support/notes/2221708

 


Regards,

Tayeb.

Re: Auto creation of Pre-Declaration through Import Delivery


Rahul,

 

/SAPSLL/API_6800_CSD_SYNCH - this is the API you use to create import declarations automatically.

 

I have used this to create import declarations automatically from both SAP and non-SAP feeder systems.

Regards

Raj

Re: Using top-level aggregations in an XML view


As far as I can see, the content aggregation of sap.ui.core.mvc.View is not marked as the default one.

JsDoc Report - SAP UI development Toolkit for HTML5 - API Reference - sap.ui.core.mvc.View

 

On the other hand, the content aggregation of sap.m.Page is the default one; "(default)" is written there.

JsDoc Report - SAP UI development Toolkit for HTML5 - API Reference - sap.m.Page

 

 

And I just tried the following:

 

<core:View xmlns:core="sap.ui.core" xmlns:mvc="sap.ui.core.mvc" xmlns="sap.m"
  controllerName="test.main" xmlns:html="http://www.w3.org/1999/xhtml">
  <Page title="Title" content="{/buttons}">
    <Button text="{text}"></Button>
  </Page>
</core:View>



  onInit: function() {
    // Build a JSON model with three buttons and set it on the view
    var buttonData = { "buttons": [{ "text": "button1" }, { "text": "button2" }, { "text": "button3" }] };
    var oModel = new sap.ui.model.json.JSONModel(buttonData);
    this.getView().setModel(oModel);
  },

 

 

Then, three buttons appear on the page, and I guess this might be what you are trying to do.

 

 

Regards,

Makoto

custom function module for change log


Hi ABAP experts,

 

I need to create a function module that returns the change log of a table I want to inspect.

I know there are plenty of tables, transaction codes, and standard FMs for this requirement; however, I want to create a custom FM for it.

The input parameters should be table name, date, and time.

 

Please don't delete this discussion; I have searched a lot on SCN but couldn't find anything.

I just need a simple piece of code, which I haven't managed to write myself.

 

Please help!

 

Thanks in advance.

Erdem
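
A minimal sketch of such a function module could read the standard change-document tables CDHDR and CDPOS directly (assuming the table in question actually writes change documents, a 7.40+ system for the SQL syntax, and a hypothetical DDIC table type ZTT_CHANGE_LOG for the result):

FUNCTION z_get_change_log.
*"----------------------------------------------------------------------
*"  IMPORTING
*"    VALUE(IV_TABNAME) TYPE TABNAME
*"    VALUE(IV_DATE)    TYPE DATUM
*"    VALUE(IV_TIME)    TYPE UZEIT
*"  EXPORTING
*"    VALUE(ET_LOG)     TYPE ZTT_CHANGE_LOG  " hypothetical table type
*"----------------------------------------------------------------------

  " CDHDR holds one header per change (user, date, time, transaction);
  " CDPOS holds the changed fields with their old and new values.
  " CDPOS-TABNAME restricts the result to the requested table.
  SELECT h~objectclas, h~objectid, h~changenr,
         h~username, h~udate, h~utime, h~tcode,
         p~tabname, p~fname, p~chngind,
         p~value_old, p~value_new
    FROM cdhdr AS h
    INNER JOIN cdpos AS p
            ON p~objectclas = h~objectclas
           AND p~objectid   = h~objectid
           AND p~changenr   = h~changenr
    WHERE p~tabname = @iv_tabname
      AND (    h~udate > @iv_date
            OR (     h~udate  = @iv_date
                 AND h~utime >= @iv_time ) )
    ORDER BY h~udate, h~utime
    INTO CORRESPONDING FIELDS OF TABLE @et_log.

ENDFUNCTION.

Note that this only returns changes for tables whose fields are flagged for change documents in their data elements; tables covered by technical table logging (DBTABLOG) would need a different read path.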
