How do you map Access fields and Excel columns before importing

neelsfer
547 512MB
I think it is called an import specification.
I usually import three Text fields ("IDNo", "Name" and "Surname") from an Excel spreadsheet into an Access table called "Cyclist".
The fields may have different names in Excel, e.g.:

Excel ------------------ Access
IdentityNumber --------- IDNo
Firstname -------------- Name
Surname ---------------- Surname

The bottom line is that the field names in Excel may vary, while in Access they stay the same.

When I press the Import button, I would like a "mapping" table or something similar to appear, and then I can manually link the correct Excel and Access fields.
Any ideas whether it's possible and how it can be done?

I currently use this code to import from a specific directory. I still need to figure out the file-open function for this (from NeoPa), to avoid being tied to a specific directory.

DoCmd.TransferSpreadsheet acImport, 10, "Cyclist", "c:\RTBU\AthleteData.xls", True, ""

It must not import duplicate "IDNo" data.
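
For reference, a minimal sketch of what that file-open step might look like, assuming a version of Access recent enough to expose Application.FileDialog (late bound here, so no extra reference is needed). The filter and the rest of the call simply reuse the TransferSpreadsheet line above; it does not address the duplicate-IDNo requirement.

Dim fd As Object
Dim strFile As String

Set fd = Application.FileDialog(3)    ' 3 = msoFileDialogFilePicker
fd.Filters.Clear
fd.Filters.Add "Excel workbooks", "*.xls; *.xlsx"
fd.AllowMultiSelect = False

'Only import if the user actually picked a file
If fd.Show Then
    strFile = fd.SelectedItems(1)
    DoCmd.TransferSpreadsheet acImport, 10, "Cyclist", strFile, True
End If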
Sep 8 '11 #1

✓ answered by NeoPa (see post #5 below)

NeoPa
32,181 Expert Mod 16PB
The flexibility of the interface for Excel importing/exporting is very limited.

One option is to import with the HasFieldNames parameter set to False, but that will cause problems if the header text doesn't fit the field it lands in, as well as meaning you'd need to delete the first record from the imported data afterwards.

The other is to import into a completely new table then transfer the data across using a query. This would involve code to determine which field names had been used. You'd also want to remove the new table afterwards as it would have no further use.
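
A rough sketch of what that second option might look like in code. The scratch-table name tblCyclistImport, the alternative heading spellings in the Select Case and the duplicate-IDNo filter are all illustrative assumptions, not something from the thread:

Dim db As DAO.Database
Dim fld As DAO.Field
Dim strSrcList As String, strDstList As String
Dim strDst As String, strIDSrc As String

'1. Pull the workbook into a scratch table, headings and all
DoCmd.TransferSpreadsheet acImport, 10, "tblCyclistImport", _
    "c:\RTBU\AthleteData.xls", True

'2. Work out which imported heading maps to which Cyclist field
Set db = CurrentDb()
For Each fld In db.TableDefs("tblCyclistImport").Fields
    Select Case fld.Name
        Case "IdentityNumber", "ID Number", "IDNo": strDst = "IDNo"
        Case "Firstname", "First Name", "Name":     strDst = "Name"
        Case "Surname", "Last Name":                strDst = "Surname"
        Case Else:                                  strDst = ""    'ignore anything unexpected
    End Select
    If Len(strDst) > 0 Then
        strSrcList = strSrcList & ", [" & fld.Name & "]"
        strDstList = strDstList & ", [" & strDst & "]"
        If strDst = "IDNo" Then strIDSrc = fld.Name
    End If
Next fld

'3. Append the rows, skipping IDNo values already in Cyclist
db.Execute "INSERT INTO Cyclist (" & Mid(strDstList, 3) & ") " & _
           "SELECT " & Mid(strSrcList, 3) & " FROM tblCyclistImport " & _
           "WHERE [" & strIDSrc & "] NOT IN (SELECT IDNo FROM Cyclist);", _
           dbFailOnError

'4. The scratch table has no further use
DoCmd.DeleteObject acTable, "tblCyclistImport"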
Sep 8 '11 #2
neelsfer
547 512MB
Thx for the response, NeoPa. My only other option, I suppose, is to use a pre-formatted spreadsheet with the correct titles, and then to copy/paste the data into the correct fields before importing.
It would have been nice to have an option like the one in Outlook, where data fields are mapped before importing.
Sep 8 '11 #3
neelsfer
547 512MB
NeoPa, I just saw this mapping tool on the internet. It's not free, but this would be ideal.
http://www.4tops.com/excelimport.htm

Attached image: match.jpg
Sep 8 '11 #4
NeoPa
32,181 Expert Mod 16PB
That option's available for text file imports/exports too. It's only Excel where MS seem to have made a bit of a mess with their handling (as far as I can see at least). If the data contains field names, though, simply specifying the field names wouldn't be enough, as the import would still try to bring the field-name row in as data, which could itself cause problems.

Ultimately, it's possible to handle in code, but far more fiddly than it needs to be. A template file may well be the easiest way around it.
Sep 8 '11 #5
patjones
931 Expert 512MB
I just wanted to chip in here and mention that in the very project I'm working on right now, I went the route of importing the spreadsheet to a temporary table then INSERTing the data into the permanent table. It's not too messy...just a few lines of code and an INSERT query that you write once.

The business requirements of this project meant I had to go an extra step and write a function that checks the temporary table to make sure all the necessary columns are in place. It's possible for the user to import a spreadsheet completely unrelated to the task at hand, so the column check ensures that the right data is there prior to attempting the INSERT into the permanent table.

Pat
Sep 8 '11 #6
NeoPa
32,181 Expert Mod 16PB
I expect Neels would benefit from your sharing the details of that Pat - at least as far as they overlap with this.
Sep 8 '11 #7
neelsfer
547 512MB
Thx for the input.
I have a race-timing program. The race organisers use external companies to handle the online entries for them. A spreadsheet is then forwarded to the race organiser with column titles that differ from the ones I use when importing the data into my table. I need to simplify the process of importing.
How do you match imported fields that may have different titles to your table fields?
I don't suppose you could assist me with an example of the code, please?
Sep 8 '11 #8
NeoPa
32,181 Expert Mod 16PB
I just noticed something when looking at your other current thread Neels. One of those kick-myself moments.

You can use the Range parameter to skip the first row. This means you can ignore the field names within the worksheet and import the data into your predefined table as long as you know the format the data is expected in.
Sep 8 '11 #9
NeoPa
32,181 Expert Mod 16PB
Or maybe not!

I tried it in code first and, quite bizarrely, when I specified not to use field names from the first row, instead of using the names from the destination table as I would have expected (it makes sense, right?), it decided to assign each column a name of its own (F1, F2 & F3 in my test).

When I tried it manually and deselected the use of field names in the first row, it greyed out the option of importing into an existing table.

With application logic like that, this won't be as easy as perhaps it should be (that kick-myself moment just got a little better and turned into a kick-MS moment instead).
Sep 8 '11 #10
patjones
931 Expert 512MB
This is going to seem convoluted, but bear with me. The basic idea is based upon knowing ahead of time what the column headings will be. In my instance, the users are running reports in a mainframe system and dumping the results into Excel. They then import those spreadsheets into the Access database using the interface that I built for them.

There are three different spreadsheets that they could import, each with different columns. The user specifies by picking from a combo box what it is they are going to import: pay data, leave data, or time data. They then browse for the spreadsheet using a file picker dialog that I raise.

Now, the mainframe always gives the columns for a particular report the same names, so provided that the user doesn't edit the spreadsheet prior to the Access import, I can tell Access what column names to expect for a particular spreadsheet by defining them in a comma-delimited string. For example, the spreadsheet with pay data has the columns Employee Number, EARN_DATE, Event Type, Hours1 and Amount. So I define a string like this:

strReqFieldList = "Employee Number,EARN_DATE,Event Type,Hours1,Amount"

At this point we want to make sure that the temporary table actually contains these columns, because it's possible that the user browsed for and imported something other than a pay data spreadsheet, even though they indicated that they were trying to import a pay spreadsheet.

So I made a function that takes the temporary table name and the required columns string as arguments. Essentially, it loops through the required column list, making sure that each column is in fact in the imported data (error handling omitted for clarity):

Public Function MissingField(strTableName As String, Optional strReqFields As String) As String

    'Connect to the table that we want to check for missing columns
    Dim db As DAO.Database
    Dim ltabdef As DAO.TableDef

    Set db = CurrentDb()
    Set ltabdef = db.TableDefs(strTableName)

    'Variable to hold a column definition
    Dim lflddef As DAO.Field

    'Array to hold the required column names
    Dim arrReqFields() As String
    arrReqFields = Split(strReqFields, ",")

    'Loop counter
    Dim M As Byte

    For M = 0 To UBound(arrReqFields)
        'Start out assuming that the field name does not exist in the imported data
        MissingField = arrReqFields(M)

        'Enumerate the columns in the table and check whether the column name in question, arrReqFields(M), is in the collection
        For Each lflddef In ltabdef.Fields
            If lflddef.Name = arrReqFields(M) Then MissingField = "": Exit For
        Next lflddef

        'If we come out of the column name check with a non-empty string, we have a missing column...so exit the function
        If MissingField <> "" Then Exit For
    Next M

End Function

If the function returns a zero length string, there are no columns missing - and so we know that the imported data contains what we're looking for. I wrote this function a while ago. In hindsight there is probably a simpler way to do the check using some crafty SQL. I haven't gotten to that yet, but one of these days for sure.

What this whole process ensures is that, at minimum, the columns that you need for your business process are present.
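
For completeness, a hedged sketch of how the check and the append might fit together. The table names tblPayImport and tblPay are made up purely for illustration, and Pat's actual INSERT query isn't shown in the thread, so this is only a guess at its shape:

Dim strReqFieldList As String
Dim strMissing As String

strReqFieldList = "Employee Number,EARN_DATE,Event Type,Hours1,Amount"
strMissing = MissingField("tblPayImport", strReqFieldList)

If Len(strMissing) > 0 Then
    MsgBox "Column '" & strMissing & "' is missing from the imported spreadsheet."
Else
    'All required columns are present, so append into the permanent table
    CurrentDb.Execute _
        "INSERT INTO tblPay ([Employee Number], EARN_DATE, [Event Type], Hours1, Amount) " & _
        "SELECT [Employee Number], EARN_DATE, [Event Type], Hours1, Amount " & _
        "FROM tblPayImport;", dbFailOnError
End If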
Sep 9 '11 #11
