Human Generated Data

Title

Untitled (two women at tea for D.A.R. meeting, prints)

Date

c. 1970, printed from earlier negative

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18584

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.5
Human 99.5
Clothing 98.8
Apparel 98.8
Person 97.2
Shoe 81.5
Footwear 81.5
Suit 76.3
Coat 76.3
Overcoat 76.3
Leisure Activities 75.7
Room 73.8
Indoors 73.8
Furniture 72.7
Evening Dress 70.3
Fashion 70.3
Robe 70.3
Gown 70.3
Piano 61.5
Musical Instrument 61.5
Floral Design 56.2
Art 56.2
Graphics 56.2
Pattern 56.2

Clarifai
created on 2023-10-22

people 99.9
woman 98.4
two 98
group 97.5
adult 97.4
wear 93.8
wedding 93.5
man 93.1
portrait 93
dress 92.5
three 91
group together 88.3
leader 87.9
outfit 87.3
actress 86.5
administration 85.5
street 80.1
four 76.1
music 75.4
home 74.9

Imagga
created on 2022-02-25

man 32.3
people 31.2
male 30.7
person 29.9
adult 25.5
business 24.9
businessman 22.1
professional 21.2
office 20.2
portrait 19.4
corporate 18.9
happy 18.8
women 18.2
men 17.2
two 16.9
smiling 16.6
fashion 16.6
couple 16.6
smile 16.4
lady 16.2
life 15.4
executive 15.1
suit 14.8
standing 14.8
outfit 14.2
black 14.1
indoors 14.1
holding 14
together 14
pretty 14
attractive 14
clothing 13.3
lifestyle 13
group 12.9
home 12.8
businesswoman 12.7
work 12.6
handsome 12.5
family 12.4
full length 11.6
interior 11.5
looking 11.2
sax 11.1
expression 11.1
modern 10.5
success 10.5
meeting 10.4
youth 10.2
student 10.1
cute 10
dress 9.9
team 9.9
human 9.7
building 9.7
job 9.7
boy 9.6
wind instrument 9.5
tie 9.5
model 9.3
casual 9.3
teamwork 9.3
house 9.2
indoor 9.1
director 9.1
garment 9
style 8.9
room 8.8
diversity 8.6
happiness 8.6
reading 8.6
talking 8.6
teen 8.3
occupation 8.2
teacher 8.2
worker 8.2
cheerful 8.1
jacket 7.9
teenage 7.7
mother 7.6
fun 7.5
manager 7.4
alone 7.3
color 7.2
love 7.1
to 7.1
guy 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

person 99.1
clothing 97.9
text 96.7
indoor 89.6
furniture 86.4
standing 82.9
woman 75.8
dress 66.7
chair 64.8
man 55.1
table 50.4

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 100%
Surprised 37.5%
Calm 21.8%
Happy 21.5%
Fear 9.6%
Angry 3.3%
Confused 2.6%
Disgusted 2.5%
Sad 1.2%

AWS Rekognition

Age 12-20
Gender Male, 98.9%
Sad 38.1%
Calm 35%
Confused 10.2%
Angry 7.1%
Disgusted 6.8%
Fear 1.2%
Surprised 1.2%
Happy 0.5%

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Person 97.2%
Shoe 81.5%
Suit 76.3%

Text analysis

Google

MI-- YT37A 2-
MI--
YT37A
2-