Human Generated Data

Title

Untitled (couple dancing in long room with woman looking on)

Date

1950

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Machine Generated Data

Tags (confidence, %)

Amazon

Apparel 98.8
Clothing 98.8
Human 98.8
Person 98.8
Person 97.2
Helmet 95.9
Person 95
Helmet 94.5
Overcoat 89.5
Coat 89.5
Suit 89.5
Face 61.2
Female 59.3
Sleeve 57.2
Photography 55.5
Photo 55.5
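The lists above are label/confidence pairs as returned by the respective image-tagging services. As a minimal sketch of how such output might be post-processed (the data below is transcribed from the Amazon list above; the 90% threshold is an arbitrary illustrative choice, not part of the source data):

```python
# Label/confidence pairs transcribed from the Amazon tag list above.
amazon_tags = [
    ("Apparel", 98.8), ("Clothing", 98.8), ("Human", 98.8), ("Person", 98.8),
    ("Person", 97.2), ("Helmet", 95.9), ("Person", 95.0), ("Helmet", 94.5),
    ("Overcoat", 89.5), ("Coat", 89.5), ("Suit", 89.5), ("Face", 61.2),
    ("Female", 59.3), ("Sleeve", 57.2), ("Photography", 55.5), ("Photo", 55.5),
]

def confident_labels(tags, threshold=90.0):
    """Return unique label names at or above the confidence threshold,
    preserving first-seen order (services may repeat a label per instance)."""
    seen = []
    for label, confidence in tags:
        if confidence >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(amazon_tags))
# ['Apparel', 'Clothing', 'Human', 'Person', 'Helmet']
```

Duplicate labels (e.g. three "Person" entries) reflect one score per detected instance, so de-duplicating while thresholding gives a compact summary of what the service saw.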

Clarifai

people 99.7
monochrome 96.2
man 96.1
adult 94.9
group 94.7
woman 91.3
group together 90.2
three 87
child 85.5
two 84.7
family 82.1
wear 81.8
room 79.9
portrait 78.9
administration 78.7
street 76.5
five 76
indoors 76
interaction 72.7
war 72.7

Imagga

shower cap 57
cap 51.8
headdress 40.7
man 35.6
clothing 31.7
people 31.2
male 30.5
person 29.1
adult 26.8
men 21.5
professional 20.8
business 20.6
businessman 18.5
corporate 16.3
indoors 15.8
face 15.6
covering 15.5
work 14.9
consumer goods 14.3
portrait 13.6
handsome 13.4
casual 12.7
office 12.6
job 12.4
patient 12.2
occupation 11.9
women 11.9
worker 11.7
executive 11.6
human 11.2
surgeon 11.2
looking 11.2
black 10.8
room 10.5
health 10.4
senior 10.3
bathing cap 10.2
lifestyle 10.1
suit 9.9
mask 9.8
medical 9.7
one 9.7
success 9.6
hospital 9.6
adults 9.5
sitting 9.4
window 9.4
happy 9.4
doctor 9.4
manager 9.3
horizontal 9.2
hand 9.1
modern 9.1
attractive 9.1
care 9
team 9
building 8.9
teacher 8.9
standing 8.7
businesspeople 8.5
mature 8.4
case 8.2
nurse 8.2
indoor 8.2
equipment 8.2
hat 8.2
chair 7.9
urban 7.9
hands 7.8
surgery 7.8
disease 7.8
corporation 7.7
uniform 7.7
industry 7.7
elderly 7.7
illness 7.6
career 7.6
meeting 7.5
holding 7.4
teamwork 7.4
helmet 7.3
group 7.3
hair 7.1
family 7.1
spectator 7.1
to 7.1
interior 7.1
working 7.1
happiness 7
medicine 7

Google

Microsoft

person 98.9
standing 92.1
black and white 44.7
game 14.7
art 11.2
monochrome 11

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Male, 79%
Sad 9.8%
Confused 3.3%
Angry 5.2%
Surprised 6%
Calm 20.3%
Disgusted 5.1%
Happy 50.3%

AWS Rekognition

Age 30-47
Gender Male, 90.4%
Surprised 3.3%
Disgusted 1.5%
Calm 21.4%
Sad 8%
Confused 4.1%
Happy 57.9%
Angry 3.9%
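Each face block reports one confidence per emotion. A simple way to summarize such a block is to take the highest-scoring emotion; the sketch below uses the values transcribed from the second face above (the aggregation is an illustrative choice, not something the service itself reports):

```python
# Emotion/confidence pairs transcribed from the second AWS Rekognition face above.
emotions = {
    "Surprised": 3.3, "Disgusted": 1.5, "Calm": 21.4, "Sad": 8.0,
    "Confused": 4.1, "Happy": 57.9, "Angry": 3.9,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))
# ('Happy', 57.9)
```

Note that the scores need not sum to 100%, so the dominant emotion is a ranking, not a probability.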

Feature analysis

Amazon

Person 98.8%
Helmet 95.9%

Captions

Microsoft

a group of people standing in a room 95.2%
a man and a woman standing in a room 88.1%
a group of men standing in a room 88%