Human Generated Data

Title

Untitled (eight family members posed sitting in living room next to Christmas tree)

Date

1949

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99
Human 99
Person 98.2
Person 97.4
Person 96.9
Person 95.6
Apparel 94.9
Clothing 94.9
Person 94.2
Person 93.3
Person 91.4
Stage 83.4
People 75.7
Footwear 73.9
Shoe 73.9
Furniture 68.3
Female 64.6
Tablecloth 64.1
Face 61.8
Dress 60.2
Photography 59.9
Photo 59.9
Fashion 59.2
Robe 59.2
Flooring 58.8
Indoors 57.9
Linen 57.6
Home Decor 57.6
Floor 57.4
Gown 56.4
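The Amazon tags above pair each label with a confidence score. A minimal sketch of one common way such scores are used, filtering labels below a threshold (illustrative subset of the data above, not the service's API):

```python
# Illustrative subset of the Amazon labels listed above: (tag, confidence).
labels = [
    ("Person", 99.0), ("Apparel", 94.9), ("Stage", 83.4),
    ("Footwear", 73.9), ("Furniture", 68.3), ("Gown", 56.4),
]

def confident_labels(pairs, threshold=70.0):
    """Return tags whose confidence meets the threshold, highest first."""
    kept = [(tag, score) for tag, score in pairs if score >= threshold]
    return [tag for tag, _ in sorted(kept, key=lambda p: -p[1])]

print(confident_labels(labels))  # → ['Person', 'Apparel', 'Stage', 'Footwear']
```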

Clarifai

people 99.8
group 99.6
group together 99.2
man 97.6
many 97.3
leader 97
adult 97
woman 96.7
administration 93.2
meeting 90.3
several 87.4
music 87.4
child 86.7
wear 85.1
five 84.5
furniture 82
education 81.8
outfit 80.9
room 80.4
chair 80.1

Imagga

musical instrument 59
marimba 52.3
percussion instrument 44.5
man 41.1
stringed instrument 33.9
people 32.9
male 30.5
person 28.3
businessman 28.2
business 27.3
women 25.3
sitting 24.9
adult 22.9
meeting 22.6
men 22.3
group 20.9
table 19.9
team 19.7
happy 19.4
office 19.4
couple 19.2
device 18.4
together 16.6
smiling 16.6
lifestyle 16.6
portrait 16.2
room 16.1
teacher 15.9
businesswoman 15.4
professional 15.3
businesspeople 15.2
executive 15.1
desk 14.2
interior 14.1
work 14.1
indoors 14.1
mature 13.9
student 13.9
archive 13.8
corporate 13.7
manager 13
cheerful 13
indoor 12.8
two 12.7
suit 12.6
job 12.4
talking 12.4
modern 11.9
love 11.8
communication 11.8
colleagues 11.7
worker 11
holding 10.7
smile 10.7
standing 10.4
education 10.4
friends 10.3
teamwork 10.2
classroom 10
success 9.7
four 9.6
happiness 9.4
senior 9.4
relaxation 9.2
copy space 9
groom 9
home 8.8
discussion 8.8
30s 8.7
day 8.6
drinking 8.6
boss 8.6
glass 8.6
enjoyment 8.4
presentation 8.4
training 8.3
life 8.3
20s 8.2
romantic 8
night 8
working 8
holiday 7.9
conference 7.8
40s 7.8
color 7.8
restaurant 7.8
full length 7.8
husband 7.6
casual 7.6
finance 7.6
hand 7.6
wife 7.6
adults 7.6
togetherness 7.6
wine 7.4
phone 7.4
new 7.3
confident 7.3
idea 7.1
family 7.1

Google

Microsoft

floor 93.6
indoor 87.8
ballet 87.8

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 52.3%
Happy 45.1%
Confused 45.2%
Calm 45.6%
Sad 53.6%
Disgusted 45.1%
Angry 45.3%
Surprised 45.1%

AWS Rekognition

Age 14-25
Gender Male, 54.7%
Sad 46.8%
Confused 45.1%
Happy 46.9%
Angry 45.2%
Calm 50.6%
Disgusted 45.1%
Surprised 45.1%

AWS Rekognition

Age 11-18
Gender Female, 51.4%
Sad 47.4%
Disgusted 45.2%
Calm 48.6%
Happy 47.4%
Confused 45.4%
Surprised 45.3%
Angry 45.7%

AWS Rekognition

Age 20-38
Gender Male, 51.4%
Disgusted 45.4%
Confused 45.3%
Surprised 45.2%
Angry 45.7%
Happy 45.4%
Sad 47.4%
Calm 50.7%

AWS Rekognition

Age 26-43
Gender Male, 54.5%
Happy 53.4%
Calm 45.8%
Sad 45.3%
Angry 45.2%
Surprised 45.1%
Disgusted 45.1%
Confused 45.1%

AWS Rekognition

Age 35-52
Gender Male, 51%
Sad 45.8%
Angry 45.2%
Surprised 45.1%
Happy 53.1%
Calm 45.4%
Confused 45.1%
Disgusted 45.2%

AWS Rekognition

Age 27-44
Gender Male, 54.3%
Happy 45.5%
Disgusted 45.3%
Sad 49.5%
Angry 45.3%
Confused 45.3%
Surprised 45.2%
Calm 48.9%

AWS Rekognition

Age 30-47
Gender Female, 50.1%
Angry 49.5%
Surprised 49.5%
Sad 50.4%
Calm 49.5%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%

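Each AWS Rekognition face block above reports one confidence value per emotion; the face's dominant emotion is simply the highest-scoring entry. A sketch of that selection, using values copied from the first face block (illustrative only):

```python
# Emotion confidences from the first AWS Rekognition face block above.
face = {
    "Happy": 45.1, "Confused": 45.2, "Calm": 45.6, "Sad": 53.6,
    "Disgusted": 45.1, "Angry": 45.3, "Surprised": 45.1,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # → ('Sad', 53.6)
```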
Feature analysis

Amazon

Person 99%
Shoe 73.9%

Captions

Microsoft

a group of people in a room 95%
a group of people standing in a room 93.3%
a group of people sitting at a table 80.5%

Text analysis

Amazon

O