Human Generated Data

Title

Untitled (women at party)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19317

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 98.9
Human 98.9
Clothing 98.6
Apparel 98.6
Person 98.1
Evening Dress 92.8
Fashion 92.8
Robe 92.8
Gown 92.8
Person 88.1
Costume 79.1
Door 66.5
Shoe 58.5
Footwear 58.5
Female 58
Back 55.3
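
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels API returns. A minimal sketch of how such tags could be regenerated, assuming AWS credentials are configured and the image is available locally; the file name is hypothetical, not part of this record:

```python
# Minimal sketch: label/confidence tags via Amazon Rekognition DetectLabels.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("burian_women_at_party.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest score listed above is 55.3
    )

for label in response["Labels"]:
    # One entry per label; repeated entries such as "Person 98.1" come from
    # per-instance scores under label["Instances"].
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```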

Clarifai
created on 2023-10-22

people 99.6
wedding 98.4
woman 98.4
man 97
adult 96.5
two 95.3
bride 94.5
wear 93.9
groom 93.9
family 91.6
dress 91.5
love 89.3
doorway 88.7
indoors 88.4
group 85.7
actress 83
child 82.6
portrait 80.9
ceremony 80.1
affection 79.6
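
The concepts above (people, wedding, woman, ...) resemble the output of Clarifai's predict endpoint for a general image-recognition model. A hedged sketch using the v2 REST API; the API key, model identifier, and image URL are placeholders, and the exact endpoint form can vary by account setup:

```python
# Hedged sketch: concept tags from Clarifai's v2 predict endpoint.
# API key, model ID, and image URL are placeholders, not values from this record.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                 # placeholder
MODEL_ID = "general-image-recognition"            # placeholder model identifier
IMAGE_URL = "https://example.org/image.jpg"       # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; scale to match the listing.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```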

Imagga
created on 2022-02-25

call 56.5
man 31.6
male 24.1
people 23.4
person 22.2
black 21.2
adult 19.7
dress 16.3
attractive 16.1
fashion 15.8
portrait 15.5
device 14.5
lifestyle 14.4
suit 14
pretty 14
locker 13.8
looking 13.6
business 13.4
urban 13.1
sexy 12.8
casual 12.7
businessman 12.4
men 12
human 12
occupation 11.9
building 11.9
hair 11.9
work 11.8
standing 11.3
fastener 11
model 10.9
city 10.8
professional 10.8
face 10.6
happy 10.6
lady 10.5
couple 10.4
women 10.3
elegance 10.1
job 9.7
one 9.7
life 9.7
indoors 9.7
love 9.5
restraint 9.1
jacket 9.1
necktie 9.1
bow tie 8.9
telephone 8.7
brunette 8.7
hands 8.7
wall 8.5
youth 8.5
silhouette 8.3
office 8.2
style 8.2
gorgeous 8.2
posing 8
body 8
cute 7.9
happiness 7.8
corporate 7.7
modern 7.7
power 7.6
window 7.5
romance 7.1
smile 7.1
working 7.1
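
The Imagga tags above can be produced with Imagga's v2 tagging endpoint, which authenticates with an API key/secret pair. A hedged sketch; the credentials and image URL are placeholders:

```python
# Hedged sketch: tag/confidence pairs from Imagga's v2 tagging endpoint.
# Key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders
IMAGE_URL = "https://example.org/image.jpg"            # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```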

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 98.8
person 96.7
wall 95.3
woman 90.7
dress 89.5
poster 87.7
clothing 86.5
posing 66.9
dressed 41.4
picture frame 15

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 41-49
Gender Female, 54.3%
Calm 55.7%
Happy 42%
Sad 0.8%
Angry 0.5%
Disgusted 0.4%
Fear 0.4%
Surprised 0.2%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Female, 97.1%
Happy 87.6%
Disgusted 5.5%
Sad 3%
Angry 1.2%
Calm 1%
Surprised 0.6%
Fear 0.6%
Confused 0.5%

AWS Rekognition

Age 25-35
Gender Female, 97.5%
Fear 98.1%
Calm 1.2%
Sad 0.3%
Surprised 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Happy 0%
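
The three face records above (age range, gender with confidence, and an emotion score breakdown) match the fields Amazon Rekognition DetectFaces returns when all attributes are requested. A minimal sketch, using the same hypothetical local file name as before:

```python
# Minimal sketch: age range, gender, and emotion scores per detected face
# via Amazon Rekognition DetectFaces with Attributes=["ALL"].
# The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("burian_women_at_party.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are listed with per-emotion confidences, highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```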

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
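
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (Very unlikely ... Very likely) rather than percentages, which is what the three blocks above show. A minimal sketch with the google-cloud-vision client; it assumes application credentials are configured, and the file name is hypothetical:

```python
# Minimal sketch: per-face likelihood buckets from the Google Cloud Vision API.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("burian_women_at_party.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```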

Feature analysis

Amazon

Person
Person 98.9%
Person 98.1%
Person 88.1%

Categories

Text analysis

Amazon

65
JAN
167
132
2

Google

Lat 132 JAN 65
Lat
132
JAN
65
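
The detected strings above ("JAN 65", "132", etc., likely a photofinisher's date stamp on the print) are the sort of output OCR endpoints such as Amazon Rekognition DetectText produce. A minimal sketch, again with a hypothetical file name:

```python
# Minimal sketch: detected text via Amazon Rekognition DetectText.
# The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("burian_women_at_party.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; both levels carry a confidence score.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')
```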