Human Generated Data

Title

Untitled (two women holding teacup)

Date

1955, printed later

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.132

Machine Generated Data

Tags (label followed by confidence score, 0–100)

Amazon
created on 2022-01-08

Clothing 100
Apparel 100
Person 99.3
Human 99.3
Person 92.5
Robe 91
Fashion 91
Gown 87.9
Suit 87
Coat 87
Overcoat 87
Evening Dress 75.5
Floor 75.1
Furniture 73.9
Footwear 73.9
Shoe 73.3
Room 70.9
Indoors 70.9
Home Decor 60.3
Photography 60
Photo 60
Kimono 59.7
Door 56.3

Clarifai
created on 2023-10-25

people 99.9
woman 99.2
two 99
adult 98.2
group 97.5
portrait 97.5
three 96.6
man 94.7
wear 91.6
group together 90.3
administration 90.1
home 88.6
leader 86.5
facial expression 85.8
one 82.4
four 81.7
family 79.9
dress 78.6
doorway 77.3
girl 74.6

Imagga
created on 2022-01-08

people 31.2
couple 27.9
person 27.9
male 25.8
adult 24.9
man 24.9
portrait 22.7
dress 22.6
fashion 21.9
two 20.3
happy 20.1
attractive 18.2
clothing 18.2
life 16.7
smiling 16.6
women 16.6
business 16.4
smile 16.4
groom 15.3
pretty 14.7
lady 14.6
lifestyle 14.5
happiness 14.1
together 14
standing 13.9
men 13.7
black 13.5
suit 13.3
indoors 13.2
professional 12.8
full length 12.6
bride 12.5
family 12.5
cheerful 12.2
executive 12.2
corporate 12
love 11.8
sax 11.8
mother 11.6
handsome 11.6
businessman 11.5
boy 11.3
office 11.2
looking 11.2
casual 11
garment 10.8
holding 10.7
old 10.5
sexy 10.4
style 10.4
home 10.4
youth 10.2
wedding 10.1
model 10.1
indoor 10
outfit 10
bouquet 9.7
interior 9.7
grandma 9.6
building 9.6
room 9.3
child 9.2
teen 9.2
romantic 8.9
cute 8.6
kin 8.6
expression 8.5
face 8.5
senior 8.4
clothes 8.4
modern 8.4
human 8.2
fun 8.2
confident 8.2
stylish 8.1
group 8.1
romance 8
jacket 8
hair 7.9
sitting 7.7
married 7.7
director 7.6
tie 7.6
elegance 7.6
one 7.5
outdoors 7.5
teenager 7.3
pose 7.2
to 7.1

Google
created on 2022-01-08

Picture frame 95.5
Coat 87.4
Standing 86.4
Gesture 85.3
Black-and-white 85.2
Style 83.9
Fashion design 76.1
Blazer 75.9
Monochrome 75.3
Monochrome photography 74.1
Classic 73.2
Chair 72.9
Suit 72.1
Formal wear 71.9
Event 71.6
Pattern 70.9
Smile 70.4
Vintage clothing 70.3
Art 70.1
Room 67.3

Microsoft
created on 2022-01-08

clothing 98.4
person 98.4
text 97.5
standing 86.5
dress 84.1
smile 78.5
woman 64.4
suit 61.5
furniture 58.4
man 55.5
posing 43.1

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 100%
Surprised 32.2%
Calm 28.3%
Angry 14.6%
Fear 8.8%
Confused 6%
Happy 3.9%
Disgusted 3.8%
Sad 2.3%

AWS Rekognition

Age 11-19
Gender Male, 99.4%
Sad 38.9%
Calm 30.3%
Confused 17.7%
Angry 5.5%
Disgusted 4.2%
Fear 1.6%
Surprised 1.2%
Happy 0.6%

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Suit 87%
Shoe 73.3%