Monday, March 26, 2012
Gathering requirements for SQL Reporting Services
I am assigned the task of gathering requirements from business users
for implementing reports using SQL Server Reporting Services. Some of
the questions I have in mind are:
1. What is the purpose of the report?
2. How many users would be using it?
3. Where is the data coming from?
4. What delivery method is required?
5. What is the layout of the report?
What other information would I need before jumping into setting up the
framework and developing the reports?
In addition to your list, you can find out:
1. The complexity of the reports (calculated fields, charts, etc.)
2. Whether the reports are to be shown separately or should be part of the
program (this question is for Report Viewer)
3. The setup of the hardware servers (this question is for checking
performance)
4. Licensing issues should be on your checklist as well.
Amarnath
"sandeshmeda@.gmail.com" wrote:
> Hello,
> I am assigned the task of gathering requirements from business users
> for implementing reports using SQL Server Reporting Services. Some of
> the questions I have in mind are:
> 1. What is the purpose of the report?
> 2. How many users would be using it?
> 3. Where is the data coming from?
> 4. What delivery method is required?
> 5. What is the layout of the report?
> What other information would I need before jumping into setting up the
> framework and developing the reports?
Gathering information about space in a database using Transact-SQL
I want to gather information about the total space, space used, and some other
statistics like the Taskpad in Enterprise Manager, but using a Transact-SQL
script. I didn't find system tables or useful procedures to do that;
sp_spaceused gives some information, but I would like to compute some statistics
from these values.
Any ideas?
Thanks!
Hi
Run Profiler and open up the Taskpad in EM, and see what EM executes against the
DB. This gives you a good idea of what it is using.
Regards
Mike
"Her" wrote:
> Hi!,
> I want to gather information about the total space, space used, and some
> other statistics like the Taskpad in Enterprise Manager, but using a
> Transact-SQL script. I didn't find system tables or useful procedures to do
> that; sp_spaceused gives some information, but I would like to compute some
> statistics from these values.
> Any ideas?
> Thanks!
>
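As a starting point, here is a sketch of how the Taskpad numbers can be approximated from the SQL Server 2000-era system tables (`sysfiles` and `sysindexes`). These are not the exact queries Enterprise Manager runs (Profiler will show you those); the arithmetic assumes sizes are stored in 8 KB pages.

```sql
-- Sketch: approximate the Taskpad space numbers from system tables.
-- Assumes SQL Server 2000-era metadata, where sizes are in 8 KB pages.

-- Total size of all files in the current database, in MB:
SELECT SUM(size) * 8 / 1024.0 AS total_size_mb
FROM sysfiles;

-- Space reserved by data and indexes (roughly what sp_spaceused reports):
SELECT SUM(reserved) * 8 / 1024.0 AS reserved_mb
FROM sysindexes
WHERE indid IN (0, 1, 255);
```

Because these are plain SELECTs, their results can be inserted into a work table and aggregated however you like, which is the part sp_spaceused doesn't make easy.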
Friday, March 23, 2012
Fuzzy Lookup task - expressions for MatchIndexName and ReferenceTableName
I believe Fuzzy Lookup was built before we had expressionable properties in data flows.
You may consider requesting this feature on the Connect site.
Thanks,
-Bob
Fuzzy Lookup Error
Hi
I get the following error when I use Fuzzy Lookup in a Data Flow task with TransactionOption property set to “Required”
[Fuzzy Lookup [61]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Cannot create new connection because in manual or distributed transaction mode.".
When I change the TransactionOption property to “Supported”, it works fine.
I need the property set to Required so that it rolls back in the event of a failure.
Any ideas on how to get the Fuzzy Lookup to work?
Are you executing the package containing the data flow task from a parent package? I have seen an issue where the OLE DB connection does not properly defect from the transaction and throws an error like the one you're seeing here. The workaround was to execute the child package out of process.
~Matt
|||I don't have any child packages.|||Set Required on the package container, and Supported on all child containers. Then they will automatically enlist in the transaction from the package container.
Why do you need a transaction around a lookup?
Wednesday, March 21, 2012
Fuzzy grouping matching nulls to empty strings/spaces
Will the Fuzzy Grouping task match a null value to an empty string (or spaces)? I've got 5 columns I'm matching on, and one of them may be null for certain rows but an empty string for others. Given that the 4 other columns may match, will this difference stop similar rows from being grouped together?
(Someone's modified my grouped data since it was deduped, which takes a while, and I'm hoping for a quick answer on this).
Thanks in advance.
Ben
Not sure, but building a quick package and adding a data viewer should give you the answer. I would hope that a NULL wouldn't match to spaces...
Couldn't you run an update query to "fix" the data?|||
Computer says "no"
<cough>
No duplicate matching between null and empty string/spaces, as you suspected. Thanks for the idea of the quick test; it seems obvious in retrospect, if only it had been beforehand! Unfortunately I can't fix the data, other than rerunning the grouping task, but it's good to know the source of the problem. Thanks Phil!
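For anyone who can fix the data before grouping, one option is to normalize blank strings to NULL so both representations compare the same way. This is a hedged sketch; the table and column names below are placeholders, not from this thread.

```sql
-- Sketch: normalize empty/whitespace-only strings to NULL before the
-- Fuzzy Grouping runs. dbo.People and MotherMaiden are placeholder names.
UPDATE dbo.People
SET MotherMaiden = NULL
WHERE LTRIM(RTRIM(MotherMaiden)) = '';
```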
Fuzzy Grouping error
I continue to receive
Error: 0xC0047022 at Data Flow Task, Fuzzy Grouping Inner Data Flow : DTS.Pipeline: The ProcessInput method on component "Fuzzy Lookup" (97) failed with error code 0x8007000E. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at Data Flow Task, Fuzzy Grouping Inner Data Flow : DTS.Pipeline: Thread "WorkThread0" has exited with error code 0x8007000E.
Error: 0xC02020C4 at Data Flow Task, Fuzzy Grouping Inner Data Flow : OLE DB Source [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Data Flow Task, Fuzzy Grouping Inner Data Flow : DTS.Pipeline: The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047039 at Data Flow Task, Fuzzy Grouping Inner Data Flow : DTS.Pipeline: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, Fuzzy Grouping Inner Data Flow : DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Error: 0xC0047021 at Data Flow Task, Fuzzy Grouping Inner Data Flow : DTS.Pipeline: Thread "WorkThread1" has exited with error code 0xC0047039.
Error: 0xC020828A at Data Flow Task, Fuzzy Grouping [130]: A Fuzzy Grouping transformation pipeline error occurred and returned error code 0x8000FFFF: "An unexpected error occurred.".
Information: 0x402090DF at Data Flow Task, Matches [875]: The final commit for the data insertion has started.
Information: 0x402090E0 at Data Flow Task, Matches [875]: The final commit for the data insertion has ended.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "Matches" (875)" wrote 1572785 rows.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (8) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
I have set MaximumErrorCount = 10000 on the data flow task and still get the same error. The server is a dual 3.6 GHz with 3.5 GB RAM; AWE is off.
Any ideas?
Brian,
Increasing the value of MaximumErrorCount will not cause the error to go away. If you set MaximumErrorCount on the package (not the data flow task) to any integer larger than 8, the package will finish successfully despite the error.
That is, instead of
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (8) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
It will return:
SSIS package "Package.dtsx" finished: Success.
The error in Fuzzy Lookup is still there. Can you share more details about your package? How is the Fuzzy Lookup transform configured? It would be very helpful if you could share your package.
|||It's actually a Fuzzy Grouping, not a Lookup. I tried to put the XML for the project in this reply, but it is too big; any ideas on how I can send you the package?
It's pretty simple: I have an OLE DB Source with 7 fields selected: ID, First Name, Last Name, Mother Maiden, Birthdate, Gender, and SSN.
I pipe that to the Fuzzy Grouping, where all but the ID use fuzzy logic. ID is just a pass-through. I have a minimum match of 0.65.
I pipe the results to an OLE DB destination that is a new table.
Pretty straightforward, with a couple of weights on gender, DOB, SSN, and last name.
This worked with 45K records, but so far has not worked with 1.5 million.
I turned on AWE and re-ran, which also failed with the same error.
This package usually runs for 7 hours before failing.|||The current implementation of Fuzzy Grouping invokes Fuzzy Lookup. From the error you posted, it seems Fuzzy Lookup is causing trouble.
Please send the package to Runying.Mao at microsoft dot com
Could you please send some sample data besides the package?|||I'm having a similar problem. I'm trying to do a Fuzzy Lookup with a source table of 650,000 records (people) and a reference table of 24 million records (on two fields: social security number and complete name). When I use about one million records rather than 24 million in the reference table, everything works fine. Similarly, when I reduce the rows in the source table and maintain the size of the reference table, everything works OK (but in all cases I still get warnings about low virtual memory).
Here's my output (when trying the full job):
SSIS package "Package.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
Warning: 0x8000FFFF at Data Flow Task, Fuzzy Lookup [15]: Catastrophic failure
Warning: 0x8000FFFF at Data Flow Task, Fuzzy Lookup [15]: Catastrophic failure
Warning: 0x800470E9 at Data Flow Task, DTS.Pipeline: A call to the ProcessInput method for input 16 on component "Fuzzy Lookup" (15) unexpectedly kept a reference to the buffer it was passed. The refcount on that buffer was 2 before the call, and 1 after the call returned.
Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: The ProcessInput method on component "Fuzzy Lookup" (15) failed with error code 0x8000FFFF. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC02020C4 at Data Flow Task, OLE DB Source [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Data Flow Task, DTS.Pipeline: The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0x8000FFFF.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" has exited with error code 0xC0047039.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DF at Data Flow Task, OLE DB Destination [35]: The final commit for the data insertion has started.
Information: 0x402090E0 at Data Flow Task, OLE DB Destination [35]: The final commit for the data insertion has ended.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "OLE DB Destination" (35)" wrote 0 rows.
Task failed: Data Flow Task
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (7) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
Any help will be appreciated.
Regards and thanks.
GV
german @. NO amepla . SPAM com|||
Runying has been able to narrow down this problem and identified the issue.
A possible workaround is to ensure that the BufferTempStoragePath property of your data flow task is set to a drive with enough free space.
Thanks
Donald
Information: 0x4004800C at Data Flow Task, DTS.Pipeline: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 6372 buffers were considered and 6372 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
It is clear to me now that when the reference table (for Fuzzy Lookup) is huge, the package needs a lot of memory for building the fuzzy index. If you are using the same reference table all the time and the data there is rather static, you should consider saving the index for future use, which will save you a lot of time when running the package again. In my case, a Fuzzy Lookup with a source table of 650,000 rows and a reference table of 24 million rows took 72 hours and had to be done in 3 steps of approx. 220,000 source records; otherwise the package always failed.
(P4 HT 2.6 GHz, 2 GB RAM, 160 GB SATA HD, and lots of coffee =)
Regards,
GV