Human Generated Data

Title

Untitled (couple departing, Chamberlain, South Dakota)

Date

1948, printed later

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.934

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Antique Car 99.1
Vehicle 99.1
Automobile 99.1
Transportation 99.1
Car 97.7
Helmet 92.5
Clothing 92.5
Apparel 92.5
Hot Rod 92
Tire 88.7
Model T 84.6
Machine 83.1
Spoke 80.2
Car Wheel 77.6
Sports Car 74.9
Wheel 66.2
Coupe 65.4
Outdoors 61.8
Tarmac 59.7
Asphalt 59.7
Officer 55
Military 55
Military Uniform 55
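
Label lists like the one above pair each tag with a confidence percentage returned by an image-labeling service (here Amazon Rekognition). A minimal sketch of filtering such a list by a confidence threshold; the dict mirrors the shape of a Rekognition DetectLabels response, but the values are copied from the record above rather than fetched from the API:

```python
# Sketch: filtering machine-generated labels by confidence.
# The dict mirrors the shape of an Amazon Rekognition DetectLabels
# response ({"Labels": [{"Name": ..., "Confidence": ...}, ...]});
# values are hard-coded from the record above, not fetched live.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Car", "Confidence": 97.7},
        {"Name": "Helmet", "Confidence": 92.5},
        {"Name": "Wheel", "Confidence": 66.2},
        {"Name": "Officer", "Confidence": 55.0},
    ]
}

def confident_labels(resp, threshold=90.0):
    """Return label names at or above the confidence threshold."""
    return [lbl["Name"] for lbl in resp["Labels"]
            if lbl["Confidence"] >= threshold]

print(confident_labels(response))  # ['Person', 'Car', 'Helmet']
```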

Imagga
created on 2022-01-23

car 100
motor vehicle 55.1
vehicle 49.2
sedan 46.5
automobile 45.9
auto 45.9
transportation 41.2
parking meter 40.5
wheel 34.1
transport 33.8
timer 32.1
road 31.6
drive 26.5
timepiece 24.2
truck 20.9
cars 19.6
motor 19.4
speed 19.2
luxury 18.9
self-propelled vehicle 18.7
tire 18.2
wheeled vehicle 17.4
driving 17.4
street 16.6
measuring instrument 16.2
engine 15.4
traffic 15.2
limousine 15
bumper 14.7
travel 14.1
fast 14
mirror 13.9
expensive 13.4
sky 13.4
vintage 13.2
style 12.6
lamp 12.5
modern 11.9
sport 11.5
light 11.4
urban 11.4
sports 11.1
city 10.8
hood 10.8
wheels 10.7
new 10.5
old 10.4
shiny 10.3
seat 10.2
classic 10.2
design 10.1
power 10.1
parked 9.9
parking 9.8
pickup 9.7
chrome 9.4
black 9
metal 8.8
silver 8.8
antique 8.7
door 8.6
business 8.5
car mirror 8.2
technology 8.2
wealth 8.1
automobiles 7.9
headlight 7.9
driver 7.8
instrument 7.6
show 7.6
window 7.3
reflection 7.3
people 7.3
grille 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

tree 99.8
land vehicle 96.6
outdoor 95.9
vehicle 94.9
text 93.1
car 91.3
wheel 86.6

Face analysis

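The emotion rows in the face-analysis results below are per-face confidence scores across a fixed set of emotions; the predicted emotion is simply the highest-scoring one. A small sketch of that selection, with scores copied from one of the AWS Rekognition face records (the dict shape is illustrative, not an API response object):

```python
# Sketch: picking the dominant emotion from per-face confidence
# scores, as reported by AWS Rekognition face analysis. Scores are
# copied from a face record below; the dict shape is illustrative.
emotions = {
    "Happy": 99.6, "Angry": 0.2, "Disgusted": 0.1, "Sad": 0.0,
    "Fear": 0.0, "Surprised": 0.0, "Confused": 0.0, "Calm": 0.0,
}

dominant = max(emotions, key=emotions.get)
print(dominant)  # Happy
```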
AWS Rekognition

Age 45-53
Gender Female, 86.8%
Happy 100%
Surprised 0%
Disgusted 0%
Confused 0%
Calm 0%
Angry 0%
Fear 0%
Sad 0%

AWS Rekognition

Age 37-45
Gender Female, 100%
Happy 99.6%
Angry 0.2%
Disgusted 0.1%
Sad 0%
Fear 0%
Surprised 0%
Confused 0%
Calm 0%

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Car 97.7%
Helmet 92.5%
Wheel 66.2%

Captions

Microsoft

a person standing in front of a car 77.9%
a man and a woman standing in front of a car 55.6%
a person standing in front of a car 55.5%

Text analysis

Amazon

FIGE
NS
CAB
S CAB NS
57%
S
57% 1889 78)
78)
1889
COMPANYS

Google

S
CAB
NS
FIGE S CAB NS
FIGE