Use primary data sources
Is it typically better to use primary data sources in quantitative research? Why or why not? When, if ever, would secondary data sources be preferred? Explain.
Sample Solution
In quantitative research, primary data sources are typically more reliable than secondary sources. Primary data consists of information collected directly by the researcher through methods such as surveys or interviews, which gives a more accurate and up-to-date picture of the situation being studied (Woodside & Trapp, 2017). Additionally, because this data has not been previously processed or analyzed by someone else, it tends to be more trustworthy for analysis, since there are fewer opportunities for bias to be introduced during its collection.
On the other hand, secondary data consists of information gathered in past studies or published materials, which can be outdated or irrelevant depending on the context in which it was originally collected. Moreover, because of its pre-existing nature, this type of data may contain discrepancies or inaccuracies relative to recent trends or developments, which could affect how accurately results are interpreted (Kumar & Reddiar, 2018). Secondary sources may still be preferred when collecting primary data is impractical or too costly, or when the research question concerns historical or large-scale trends that existing datasets already cover. Even so, while both types of data can serve useful purposes during an investigation, relying solely on secondary sources should be avoided as much as possible when conducting quantitative research.
With regard to the assimilation of bits into chunks, Miller distinguishes between bits and chunks of information, the distinction being that a chunk is made up of multiple bits of information. It is interesting to note that while there is a finite capacity to remember chunks of information, the number of bits within each of those chunks can vary widely (Miller, 1956). However, it is not simply a case of being able to remember large chunks immediately; rather, as each bit becomes more familiar, it can be assimilated into a chunk, which is then remembered as a unit. Recoding is the process by which individual bits are 'recoded' and assigned to chunks. For example, a sequence of individual letters such as F-B-I-C-I-A-N-A-S-A can be recoded into three familiar chunks (FBI, CIA, NASA), reducing the load on immediate memory. The conclusion that can be drawn from Miller's original work, then, is that while there is an accepted limit to the number of chunks of information that can be stored in immediate (short-term) memory, the amount of information within each of those chunks can be quite high without adversely affecting recall of the same number of chunks.