Human Generated Data

Title

Girls Playing House

Date

1979-1983

People

Artist: Mary E. Frey, American 20th century

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.2
Human 99.2
Person 98.3
Room 98.2
Indoors 98.2
Person 98.1
Person 93.8
Person 93.6
Classroom 87.2
School 87.2
Bedroom 83.2
Person 82.7
Living Room 79.3
Person 77.4
People 69.2
Person 59.7
Workshop 59.2
Kindergarten 57.9
Bed 57.4
Furniture 57.4
Dorm Room 56.5
Clinic 56.2

Clarifai

people 99.9
group 99.7
child 98.9
group together 98.8
education 96.8
woman 96.6
room 96.3
school 96.1
man 96
many 95.1
adult 94.7
several 92.6
family 92.5
boy 90.5
recreation 89.7
five 89.6
indoors 89.1
furniture 87.5
four 87.3
music 85.5

Imagga

room 55.2
classroom 41.6
interior 31.8
building 28.5
modern 24.5
man 22.8
indoors 21.1
school 20.7
home 20.7
people 20.6
window 18.3
table 17.7
shop 17.3
barbershop 16.7
gymnasium 15.9
house 15.9
chair 15.9
inside 15.6
male 14.9
light 14.7
furniture 14.6
business 14.6
person 14.4
decor 14.1
structure 14.1
office 14
floor 13.9
men 13.7
hospital 13.5
luxury 12.8
indoor 12.8
architecture 12.5
glass 12.4
3d 12.4
lifestyle 11.5
apartment 11.5
athletic facility 11.2
women 11.1
mercantile establishment 10.9
kitchen 10.7
design 10.7
equipment 10.6
businessman 10.6
residential 10.5
weight 10.4
smiling 10.1
wood 10
city 10
living room 9.8
urban 9.6
life 9.6
sofa 9.6
living 9.5
wall 9.4
happy 9.4
two 9.3
adult 9.3
elegance 9.2
silhouette 9.1
cheerful 8.9
new 8.9
spacious 8.9
sports equipment 8.8
facility 8.8
barbell 8.7
comfortable 8.6
lamp 8.6
meeting 8.5
black 8.4
team 8.1
group 8.1
working 7.9
together 7.9
sport 7.7
decoration 7.6
holding 7.4
training 7.4
place of business 7.3
steel 7.1
work 7.1
happiness 7

Google

Microsoft

wall 98.8
indoor 97.2
person 84.1
clothing 78.4
text 77.7
room 63.1
gallery 55.9
dog 51.9
several 12.1

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 3-11
Gender Male, 87.5%
Sad 69.7%
Calm 21.4%
Confused 6%
Angry 1.6%
Disgusted 0.4%
Fear 0.4%
Surprised 0.3%
Happy 0.2%

AWS Rekognition

Age 13-23
Gender Male, 94.2%
Happy 98.8%
Surprised 1%
Calm 0.1%
Angry 0%
Confused 0%
Fear 0%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 12-22
Gender Male, 82.3%
Happy 82.2%
Calm 10.8%
Fear 2%
Angry 1.8%
Surprised 1.5%
Confused 1%
Sad 0.5%
Disgusted 0.2%

AWS Rekognition

Age 17-29
Gender Female, 98.1%
Happy 97.5%
Calm 0.8%
Angry 0.5%
Fear 0.3%
Surprised 0.3%
Sad 0.2%
Disgusted 0.2%
Confused 0.2%

AWS Rekognition

Age 20-32
Gender Female, 99.7%
Sad 84.4%
Calm 9.6%
Happy 2.3%
Fear 1.6%
Angry 1.4%
Surprised 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 0-3
Gender Female, 78.8%
Calm 61.5%
Sad 13%
Fear 12.6%
Surprised 6.7%
Happy 2.8%
Angry 1.9%
Confused 1.2%
Disgusted 0.3%

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 25
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

Kristy McNichol et al. sitting in a room 93.6%
Kristy McNichol et al. in a room 93.5%
Kristy McNichol et al. sitting and standing in a room 93.2%

Text analysis

Amazon

NEW
NEW YORA
YORA
60608000
1006000 60608000
1006000
o.0

Google

NEW YORK
NEW
YORK