Human Generated Data

Title

Adler Limousine, 1930-1933

Date

c. 1930-1933

People

Artist: Unidentified Artist

Classification

Archival Material

Machine Generated Data

Tags (confidence, %)

Amazon

Human 99.5
Person 99.5
Furniture 98.6
Person 87.3
Transportation 81.6
Vehicle 81.4
Automobile 80.1
Car 80.1
Truck 74.7
Apparel 70
Clothing 70
Bed 64.4
Coat 60.5
Overcoat 60.5
Suit 60.5
Couch 60
Caravan 59
Van 59
Machine 56.2
Spoke 56.2

Clarifai

people 99.9
adult 99.3
one 99
vehicle 97.8
furniture 97.6
man 97.1
two 97
woman 95.8
wear 95
leader 92.4
group 91.9
transportation system 91
administration 88.7
room 88.6
chair 88.5
lid 87.7
military 87.6
seat 86.7
veil 86.3
outfit 86.1

Imagga

car 45.9
vehicle 33
kitchen appliance 29.7
toaster 25.5
home appliance 24.2
transportation 22.4
auto 20.1
camper 20.1
automobile 19.1
appliance 17.3
box 17
man 16.8
people 15.6
driver 15.5
room 15.5
driving 15.4
recreational vehicle 15.3
sitting 14.6
transport 13.7
equipment 13.6
furniture 13.4
happy 13.1
adult 12.9
person 12.8
old 12.5
smiling 12.3
computer 11.5
self-propelled vehicle 11.5
container 11.5
drive 11.3
travel 11.3
portrait 11
work 11
business 10.9
office 10.8
male 10.6
monitor 10.5
wheel 10.4
microwave 10.1
professional 10.1
inside 10.1
truck 9.8
wheeled vehicle 9.7
bedroom 9.4
modern 9.1
road 9
black 9
new 8.9
technology 8.9
motor vehicle 8.9
metal 8.8
working 8.8
home 8.8
hospital 8.7
broken 8.7
smile 8.5
classic 8.3
printer 8.2
durables 8.1
television 8
van 7.9
happiness 7.8
luxury 7.7
pretty 7.7
electronic equipment 7.7
casual 7.6
seat 7.6
house 7.5
outdoors 7.5
cheerful 7.3
lifestyle 7.2
table 7.1
summer 7.1

Google

Classic 93.9
Motor vehicle 90.1
Vehicle 84.1
Car 83.7
Vintage car 67.5
Family car 51.5
Classic car 50.7

Microsoft

vehicle 94.4
land vehicle 94
old 73
car 66.3
wheel 62.1
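
Each provider above pairs a label with a confidence score on a 0–100 scale. As an illustration only (not part of the original record), a minimal sketch of filtering such label/score pairs by a confidence threshold, using values copied from the Microsoft list:

```python
# Minimal sketch, assuming scores are percentages as in the lists above.
# Keeps only labels at or above a confidence threshold, highest first.

def filter_tags(tags, threshold=80.0):
    """Return (label, confidence) pairs >= threshold, sorted descending."""
    return sorted(
        ((label, score) for label, score in tags if score >= threshold),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Example values taken verbatim from the Microsoft tag list.
microsoft = [
    ("vehicle", 94.4),
    ("land vehicle", 94.0),
    ("old", 73.0),
    ("car", 66.3),
    ("wheel", 62.1),
]

print(filter_tags(microsoft))  # only "vehicle" and "land vehicle" pass
```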

Face analysis

AWS Rekognition (face 1)

Age 26-43
Gender Female, 51.9%
Confused 45.1%
Surprised 45.2%
Sad 45.3%
Angry 45.2%
Happy 45.1%
Calm 54%
Disgusted 45.1%

AWS Rekognition (face 2)

Age 26-43
Gender Female, 50.9%
Sad 50.8%
Angry 45.5%
Disgusted 45.1%
Surprised 45.2%
Happy 45.4%
Calm 47.8%
Confused 45.2%
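
Each face result above lists one confidence score per emotion; the dominant emotion is simply the highest-scoring entry. A minimal sketch (values copied from the first face result; not part of the original record):

```python
# Minimal sketch: pick the dominant emotion from a Rekognition-style
# emotion/confidence listing. Values copied from the first face above.

emotions = {
    "Confused": 45.1,
    "Surprised": 45.2,
    "Sad": 45.3,
    "Angry": 45.2,
    "Happy": 45.1,
    "Calm": 54.0,
    "Disgusted": 45.1,
}

# max() over the dict keys, ranked by their confidence values.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```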

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Truck 74.7%

Captions

Microsoft

a vintage photo of a truck 80.8%
an old photo of a truck 80.7%
old photo of a truck 75.4%