Human Generated Data

Title

Adler Limousine, 1930-1933

Date

c. 1930-1933

People

Artist: Unidentified Artist

Classification

Archival Material

Machine Generated Data

Tags

Amazon

Person 99.4
Human 99.4
Furniture 99.3
Person 87.2
Transportation 85
Truck 85
Vehicle 85
Car 67.7
Automobile 67.7
Couch 64.3
Helmet 62.4
Clothing 62.4
Apparel 62.4
Cradle 58.4
Crib 55.7
Machine 55
Spoke 55

Clarifai

people 99.8
adult 99.2
one 98.7
vehicle 97.8
man 97.8
two 96.3
furniture 96.2
wear 95.2
woman 93.5
transportation system 93.3
leader 89.7
group 88.6
retro 88.4
military 86.9
lid 86.8
veil 85.9
administration 85.4
group together 84.3
monochrome 84
chair 83.5

Imagga

car 40.8
vehicle 31.5
kitchen appliance 22.1
transportation 20.6
box 20.3
furniture 20.2
home appliance 19.5
auto 17.2
toaster 16.8
automobile 16.3
room 16.3
equipment 15.4
device 14.8
driving 14.5
appliance 14.1
man 14.1
old 13.9
printer 12.9
people 12.8
driver 12.6
container 12.4
metal 12.1
bedroom 12
travel 12
home 12
truck 11.9
transport 11.9
business 11.5
floor 11.1
seat 11
modern 10.5
machine 10.5
luxury 10.3
window 10.1
house 10
office 10
camper 9.8
lamp 9.6
drive 9.5
sitting 9.4
wheel 9.4
classic 9.3
male 9.2
microwave 9.1
vintage 9.1
furnishing 9.1
adult 9
person 9
black 9
baby bed 8.8
iron lung 8.6
apparatus 8.5
wood 8.3
inside 8.3
crib 8.2
computer 8.2
technology 8.2
happy 8.1
crate 8.1
motor vehicle 7.9
chair 7.9
work 7.8
bed 7.8
pillow 7.8
portrait 7.8
broken 7.7
table 7.7
chest 7.6
military vehicle 7.6
outdoors 7.5
retro 7.4
smiling 7.2
antique 7.1
recreational vehicle 7.1
interior 7.1
working 7.1
wooden 7

Google

Microsoft

vehicle 94.6
land vehicle 94.4
old 80.6
wheel 65.6
car 61
van 60.9
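Each machine-generated tag list above pairs a label with a trailing confidence score (e.g. "land vehicle 94.4"). A minimal sketch of parsing such lines into (label, confidence) tuples; the function name and input format here are assumptions for illustration, not part of any vendor API:

```python
def parse_tags(text):
    """Split each non-empty line into a label and a trailing confidence score.

    Labels may contain spaces ("land vehicle"), so only the final
    whitespace-separated token is treated as the score.
    """
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags


# Sample input taken from the Microsoft tag list above.
sample = """\
vehicle 94.6
land vehicle 94.4
old 80.6
wheel 65.6
car 61
van 60.9"""

tags = parse_tags(sample)
# Highest-confidence label first (the lists above are already sorted).
best = max(tags, key=lambda t: t[1])
```

Because multi-word labels are common ("transportation system", "recreational vehicle"), splitting on the last space rather than the first keeps the label intact.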

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Calm 53.8%
Surprised 45.3%
Sad 45.4%
Confused 45.1%
Disgusted 45.1%
Happy 45.1%
Angry 45.3%

AWS Rekognition

Age 35-52
Gender Female, 54.5%
Happy 51.8%
Sad 47.6%
Surprised 45.1%
Confused 45%
Disgusted 45%
Calm 45.3%
Angry 45.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Truck 85%
Helmet 62.4%

Captions

Microsoft

a vintage photo of a truck 87.2%
a vintage photo of a person in a car 75.8%
an old photo of a truck 75.7%