Human Generated Data

Title

Untitled (women displaying crafts)

Date

1956, printed later

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.381

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 99.3
Person 99.1
Person 96.6
Person 93.8
People 91.1
Photography 66.3
Photo 66.3
Text 66
Family 62.7
Flooring 60.4
Female 59.4
Wood 56.3
Furniture 55.2
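
These labels and confidence scores correspond to Amazon Rekognition's DetectLabels operation. The sketch below shows one way such tags could be reproduced with boto3; the image path is a placeholder, and the MaxLabels and MinConfidence values are assumptions, not the settings used for this record.

    import boto3

    # Assumes AWS credentials are configured in the environment.
    client = boto3.client("rekognition")

    with open("photograph.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # assumed cap; the actual request settings are unknown
        MinConfidence=50.0,  # assumed threshold
    )

    # Print "Label Confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")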

Clarifai
created on 2023-10-25

people 99.9
woman 99.3
group 99.2
adult 98.7
man 97.2
actress 95.1
group together 95
portrait 95
wear 90
family 89.6
three 89.2
dress 89.1
four 88.2
facial expression 87.6
music 87.3
musician 87.3
singer 87.1
retro 86.4
monochrome 86.2
child 86
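
The Clarifai tags above are the kind of output returned by Clarifai's v2 predict endpoint. A minimal sketch using the REST API follows; the API key and image URL are placeholders, and the model identifier is an assumption based on Clarifai's public general image-recognition model.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"                 # placeholder
    IMAGE_URL = "https://example.com/photograph.jpg"  # placeholder

    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concept values are 0-1; scale to match the percentages shown above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")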

Imagga
created on 2022-01-08

person 28.3
adult 24.8
people 24.5
man 22.2
fashion 20.3
portrait 18.1
musical instrument 18
male 17.8
sexy 17.7
women 17.4
pretty 16.8
business 16.4
model 15.5
black 15.2
lifestyle 15.2
happy 15
attractive 14.7
lady 14.6
wind instrument 13.4
style 13.3
bag 13.3
businessman 13.2
smiling 13
smile 12.8
dress 12.6
happiness 12.5
clothing 12.2
brunette 12.2
cute 12.2
holding 11.5
modern 11.2
professional 11
mother 11
bags 10.7
studio 10.6
cheerful 10.6
hair 10.3
clothes 10.3
shopping 10.2
youth 10.2
brass 10.1
family 9.8
fun 9.7
together 9.6
couple 9.6
standing 9.6
casual 9.3
face 9.2
performer 9.2
gorgeous 9.1
pose 9.1
human 9
sitting 8.6
elegant 8.6
hold 8.3
executive 8.3
indoor 8.2
child 8.1
suit 8.1
success 8
worker 8
office 8
looking 8
love 7.9
shop 7.6
one 7.5
active 7.3
new 7.3
scholar 7.2
handsome 7.1
stringed instrument 7.1
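
Imagga exposes this kind of tagging through its /v2/tags endpoint, authenticated with an API key and secret over HTTP basic auth. A sketch under those assumptions; the credentials and image URL are placeholders.

    import requests

    AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholder credentials

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photograph.jpg"},  # placeholder
        auth=AUTH,
    )
    response.raise_for_status()

    # Imagga reports confidence as a 0-100 score per tag.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")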

Google
created on 2022-01-08
(no tags recorded)

Microsoft
created on 2022-01-08

person 98.9
wall 98.1
clothing 96.9
woman 91.8
posing 90
text 87
smile 78
drawing 76.1
group 73.6
people 65.2
man 59.4
room 40.9
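
These tags match the output of Azure's Computer Vision Analyze Image operation with the Tags visual feature. A hedged sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders.

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/photograph.jpg"},  # placeholder
    )
    response.raise_for_status()

    # Tag confidences are 0-1; scale to match the percentages shown above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")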

Face analysis

AWS Rekognition

Age 54-62
Gender Female, 89.7%
Calm 98.4%
Confused 0.5%
Disgusted 0.4%
Sad 0.2%
Angry 0.2%
Surprised 0.2%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 24-34
Gender Female, 99.5%
Calm 93.1%
Happy 2.7%
Angry 2.3%
Confused 0.6%
Surprised 0.4%
Sad 0.4%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 36-44
Gender Female, 100%
Happy 85.4%
Surprised 8.9%
Calm 2%
Fear 1.2%
Angry 0.9%
Confused 0.7%
Disgusted 0.5%
Sad 0.4%

AWS Rekognition

Age 57-65
Gender Female, 100%
Calm 88.4%
Surprised 6.7%
Confused 2.6%
Fear 0.7%
Angry 0.5%
Sad 0.5%
Disgusted 0.3%
Happy 0.3%

AWS Rekognition

Age 72-82
Gender Female, 57.1%
Surprised 99.6%
Calm 0.3%
Fear 0.1%
Disgusted 0%
Angry 0%
Happy 0%
Confused 0%
Sad 0%
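
The five face records above carry the fields that Rekognition's DetectFaces operation returns when called with Attributes=["ALL"]: an age range, a gender estimate with confidence, and a ranked list of emotions. A minimal boto3 sketch; the image path is a placeholder.

    import boto3

    client = boto3.client("rekognition")

    with open("photograph.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # needed for age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; rank them as the listing above does.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"],
                              reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")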

Microsoft Cognitive Services

Age 50
Gender Female

Microsoft Cognitive Services

Age 60
Gender Female

Microsoft Cognitive Services

Age 58
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 67
Gender Female
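
The single age and gender values above are characteristic of the legacy Face API (v1.0) detect operation called with returnFaceAttributes=age,gender; Microsoft has since restricted access to these attributes, so this sketch is historical. The endpoint, key, and image URL are placeholders.

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/photograph.jpg"},  # placeholder
    )
    response.raise_for_status()

    for face in response.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")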

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
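
Google Vision reports face attributes as likelihood buckets (Very Unlikely through Very Likely) rather than numeric scores, which is why the five records above read differently from the others. A sketch with the google-cloud-vision client; it assumes application credentials are configured, and the image path is a placeholder.

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
    client = vision.ImageAnnotatorClient()

    with open("photograph.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            # Enum names like VERY_UNLIKELY map to the buckets listed above.
            print(name, vision.Likelihood(value).name.replace("_", " ").title())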

Feature analysis

Amazon

Person 99.3%
