Human Generated Data

Title

Untitled (view of car carrying archbishop parading down street)

Date

1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11020

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Vehicle 98.4
Transportation 98.4
Automobile 98.4
Antique Car 97.6
Person 95.3
Human 95.3
Person 90.2
Person 85.7
Person 77.9
Person 70.8
Hot Rod 70.8
Person 69.2
Model T 65.9
Car 64.3
Person 62.3
Person 61.0
Convertible 56.3
Machine 56.1
Wheel 56.1
Person 51.6
Person 43.7
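The Rekognition output above is a flat list of label/confidence pairs, with repeated labels (one per detected instance). A minimal sketch of thresholding and de-duplicating such a list; the pairs below are copied from the list above, and the 90.0 cutoff is an arbitrary assumption, not a value from the source:

```python
# Rekognition-style output: (label, confidence) pairs copied from the list above.
labels = [
    ("Vehicle", 98.4), ("Transportation", 98.4), ("Automobile", 98.4),
    ("Antique Car", 97.6), ("Person", 95.3), ("Hot Rod", 70.8),
    ("Model T", 65.9), ("Car", 64.3), ("Convertible", 56.3),
    ("Wheel", 56.1), ("Person", 43.7),
]

def confident_labels(pairs, threshold=90.0):
    """Keep unique label names at or above the confidence threshold,
    preserving the original (descending-confidence) order."""
    seen, out = set(), []
    for name, score in pairs:
        if score >= threshold and name not in seen:
            seen.add(name)
            out.append(name)
    return out

print(confident_labels(labels))
# ['Vehicle', 'Transportation', 'Automobile', 'Antique Car', 'Person']
```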

Clarifai
created on 2019-03-25

vehicle 99.9
people 99.9
transportation system 99.7
car 99.2
many 98.1
group 97.5
group together 97.4
adult 97.4
driver 93.4
convertible 92.4
road 89.5
man 88.8
one 88.5
several 87.5
vintage 84.8
administration 83.7
wear 83.7
street 83.5
leader 81.8
woman 79.4

Imagga
created on 2019-03-25

car 83
motor vehicle 72.5
vehicle 57.3
amphibian 53
transportation 42.1
speed 38.5
auto 38.3
automobile 35.4
wheeled vehicle 33.3
road 30.7
transport 29.2
travel 27.4
drive 27.4
fast 25.2
city 24.1
wheel 22.1
motor 21.3
luxury 19.7
light 19.4
street 19.3
power 19.3
motion 18.8
urban 18.3
engine 18.3
sky 17.9
racer 17.3
driving 16.4
river 16.1
traffic 16.1
blur 15.8
old 15.3
sport 15
metal 14.5
style 14.1
architecture 14.1
modern 14
scene 13.8
black 13.8
tire 13.6
race 13.4
truck 13.3
sports 12.9
limousine 12.8
wheels 12.7
chrome 12.2
asphalt 12.1
landscape 11.9
color 11.7
vintage 11.6
highway 11.6
blurred 11.5
building 11.3
day 11
moving 10.5
technology 10.4
antique 10.4
classic 10.2
tourism 9.9
automotive 9.8
cars 9.8
expensive 9.6
performance 9.6
design 9.6
bridge 9.5
bumper 9.4
movement 9.4
land 9.2
retro 9
history 8.9
racing 8.8
shiny 8.7
model 8.6
tourist 8.3
reflection 8.2
landmark 8.1
new 8.1
headlight 8.1
detail 8
water 8
summer 7.7
line 7.7
clouds 7.6
side 7.5
silver 7.1

Google
created on 2019-03-25

Motor vehicle 97.4
Photograph 94.8
Vehicle 93.3
Car 89.6
Classic 85.2
Vintage car 80.2
Classic car 73.1
Photography 67.8
Sedan 60.4

Microsoft
created on 2019-03-25

white 72.2
old 47.5
vintage 40.2
street 40.2
car 33.5
black and white 25.3

Face analysis

Amazon

AWS Rekognition

Age 6-13
Gender Male, 50.3%
Sad 49.8%
Calm 49.6%
Happy 49.5%
Surprised 49.5%
Disgusted 49.6%
Confused 49.6%
Angry 49.9%

AWS Rekognition

Age 26-43
Gender Female, 51.3%
Disgusted 50.4%
Surprised 45.6%
Calm 45.5%
Happy 46.1%
Sad 45.3%
Confused 45.6%
Angry 46.4%

AWS Rekognition

Age 11-18
Gender Female, 50.4%
Sad 50.3%
Happy 49.6%
Angry 49.5%
Disgusted 49.6%
Confused 49.5%
Calm 49.5%
Surprised 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Confused 49.6%
Angry 49.8%
Surprised 49.5%
Happy 49.5%
Calm 49.6%
Disgusted 49.8%
Sad 49.7%

AWS Rekognition

Age 12-22
Gender Female, 50.5%
Confused 49.6%
Happy 49.6%
Sad 49.8%
Surprised 49.5%
Angry 49.6%
Disgusted 49.5%
Calm 49.9%

AWS Rekognition

Age 15-25
Gender Female, 50.3%
Angry 49.8%
Surprised 49.5%
Sad 49.6%
Calm 49.5%
Confused 49.5%
Happy 49.5%
Disgusted 50%
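Each Rekognition face record above scores every emotion independently, so the scores in a record do not sum to 100%. A sketch of reducing each record to its highest-scoring emotion; the score dicts below are copied from the first two face records above:

```python
# Emotion scores copied from the first two face records above.
faces = [
    {"Sad": 49.8, "Calm": 49.6, "Happy": 49.5, "Surprised": 49.5,
     "Disgusted": 49.6, "Confused": 49.6, "Angry": 49.9},
    {"Disgusted": 50.4, "Surprised": 45.6, "Calm": 45.5, "Happy": 46.1,
     "Sad": 45.3, "Confused": 45.6, "Angry": 46.4},
]

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

for face in faces:
    print(dominant_emotion(face))
# ('Angry', 49.9)
# ('Disgusted', 50.4)
```

Note how weak the signal is for most records here: the top emotion often beats the runner-up by only a few tenths of a percent.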

Feature analysis

Amazon

Person 95.3%
Car 64.3%
Wheel 56.1%

Captions

Microsoft

a vintage photo of a person 86.4%
a vintage photo of a car 75.8%
a vintage photo of a truck 75.7%

Text analysis

Amazon

KODV--2E1--IW