Human Generated Data

Title

Untitled (three women seated and looking down at small child on central woman's lap)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.5
Human 99.5
Person 99.3
Person 99.2
Sitting 93.5
Apparel 90.8
Clothing 90.8
Shoe 85
Footwear 85
Baby 82.2
Finger 79.7
People 75.3
Sleeve 71.8
Newborn 70.7
Face 68
Flooring 63.9
Photography 63.8
Photo 63.8
Portrait 63.2
Couch 58.6
Furniture 58.6
Person 58.4
Overcoat 57.9
Coat 57.9
Suit 57.9
Long Sleeve 57.6

Clarifai

child 99.8
people 99.5
group 99.3
family 99
portrait 98.5
baby 98.4
monochrome 98
son 97.8
woman 96.9
love 96.4
offspring 95.2
man 93.8
three 93.3
four 91.2
adult 89.3
daddy 88.2
music 88
group together 87.9
two 87.7
girl 87.3

Imagga

kin 36.8
brother 35.7
adult 30.5
man 30.3
male 28.4
people 26.8
couple 25.3
love 25.3
happy 23.8
portrait 23.3
black 23.2
child 23.1
person 22.8
parent 20.4
face 19.2
attractive 18.9
happiness 18.8
father 18.6
dark 18.4
family 17.8
together 17.5
boy 16.5
dad 16.3
buddy 16.3
youth 16.2
group 16.1
lifestyle 15.9
smile 15.7
looking 15.2
mother 15
smiling 14.5
sibling 14.1
sexy 13.7
cute 13.6
fashion 13.6
fun 13.5
brunette 13.1
sitting 12.9
two 12.7
handsome 12.5
friends 12.2
cheerful 12.2
world 12
model 11.7
hand 11.4
human 11.2
body 11.2
hair 11.1
women 11.1
studio 10.6
husband 10.6
lady 10.6
wife 10.4
men 10.3
relationship 10.3
expression 10.2
childhood 9.8
romantic 9.8
pretty 9.8
kid 9.7
boyfriend 9.6
girlfriend 9.6
hands 9.6
loving 9.5
eyes 9.5
casual 9.3
businessman 8.8
indoors 8.8
look 8.8
business 8.5
togetherness 8.5
leisure 8.3
holding 8.3
girls 8.2
style 8.2
fitness 8.1
romance 8
home 8
horizontal 7.5
lips 7.4
emotion 7.4
indoor 7.3
teenager 7.3
success 7.2
suit 7.2

Microsoft

wall 99.4
person 99.3
baby 98.8
human face 98.7
clothing 97.7
toddler 97.5
smile 94.4
indoor 90.4
text 88.8
boy 82.8
child 79.9
black 77.9
black and white 73.3
woman 60.9
posing 49.9

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 1-7
Gender Male, 73.2%
Angry 0.3%
Disgusted 0.3%
Happy 26.5%
Calm 72%
Sad 0.2%
Surprised 0.4%
Fear 0%
Confused 0.4%

AWS Rekognition

Age 21-33
Gender Female, 98.9%
Surprised 0.9%
Fear 2%
Angry 9.2%
Disgusted 4%
Sad 50.3%
Calm 22.1%
Happy 5.4%
Confused 6.1%

AWS Rekognition

Age 47-65
Gender Female, 95.8%
Fear 0.2%
Sad 93.4%
Angry 0.1%
Calm 5.7%
Confused 0.4%
Disgusted 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 20-32
Gender Female, 95.6%
Happy 0%
Angry 0.1%
Disgusted 0%
Surprised 0%
Sad 77.9%
Fear 0.1%
Calm 21.8%
Confused 0.2%

Microsoft Cognitive Services

Age 3
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Shoe 85%

Captions

Microsoft

a group of people posing for a photo 96.1%
a group of people posing for the camera 96%
a group of people posing for a picture 95.9%

Text analysis

Amazon

0L6110

Google

DAL 9 7
9
DAL
7