Human Generated Data

Title

Untitled (man and two women at party)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19314

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.8
Apparel 99.8
Sleeve 99.5
Person 99.5
Human 99.5
Home Decor 99
Person 98.9
Person 98.4
Chair 95.8
Furniture 95.8
Linen 92.5
Long Sleeve 92.1
Shorts 84.7
Shirt 68.1
Pants 65.6
Sunglasses 64.2
Accessories 64.2
Accessory 64.2
Female 63.4
Door 58.3
Blouse 57.5
Woman 55.3
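
The "Feature analysis" section below attributes the Amazon results to AWS Rekognition, so the labels above were presumably produced by its DetectLabels operation. A minimal sketch of such a call with boto3; the region, file name, and confidence threshold are assumptions, and AWS credentials are assumed to be configured in the environment:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold; the lowest score above is 55.3
)

# Print "Label confidence" lines in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```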

Imagga
created on 2022-03-05

bathing cap 57.8
cap 47.4
clothing 41.9
headdress 36.1
person 32
man 24.2
adult 22.9
people 22.3
portrait 20.7
fashion 20.3
model 20.2
male 19.8
dress 19
covering 18.1
happy 16.3
black 15.7
pretty 15.4
turnstile 15.4
women 15
attractive 14.7
face 14.2
crutch 13.9
human 13.5
consumer goods 13.2
lady 13
business 12.7
staff 12.6
gate 12.1
one 11.9
professional 11.9
bride 11.5
standing 11.3
sexy 11.2
clothes 11.2
looking 11.2
men 11.2
love 11
elegance 10.9
elegant 10.3
garment 10.3
lifestyle 10.1
wedding 10.1
group 9.7
urban 9.6
body 9.6
couple 9.6
movable barrier 9.5
window 9.4
wet suit 9.1
building 8.9
style 8.9
gown 8.9
hair 8.7
smiling 8.7
happiness 8.6
smile 8.5
health 8.3
city 8.3
inside 8.3
holding 8.2
sensuality 8.2
pose 8.1
worker 8.1
stick 8.1
posing 8
indoors 7.9
cute 7.9
work 7.8
brunette 7.8
eyes 7.7
two 7.6
fashionable 7.6
outdoors 7.5
team 7.2
job 7.1
businessman 7.1
together 7
look 7
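
Imagga exposes auto-tagging through a REST endpoint. A minimal sketch, assuming the `requests` library; the API key, secret, and image URL are placeholders:

```python
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder; Imagga uses HTTP Basic auth
IMAGE_URL = "https://example.org/untitled_party_scan.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each result pairs a tag with a 0-100 confidence, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```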

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.4
person 95
clothing 87.1
black and white 84.4
standing 79.9
posing 77.6
footwear 66.1
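
The Microsoft tags above (and the captions further down) have the shape of Azure Computer Vision's Analyze Image REST operation. A minimal sketch, assuming the `requests` library; the endpoint, key, and file name are placeholders:

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
response.raise_for_status()

# Tags arrive with 0-1 confidences; scale to match the percentages above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```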

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 100%
Surprised 23.5%
Calm 20.6%
Happy 13.3%
Fear 12.6%
Disgusted 11%
Confused 8.4%
Angry 6.8%
Sad 3.8%

AWS Rekognition

Age 48-54
Gender Male, 93.6%
Confused 60.1%
Sad 15%
Calm 12.3%
Surprised 7.7%
Happy 2.1%
Disgusted 1.8%
Angry 0.6%
Fear 0.4%

AWS Rekognition

Age 38-46
Gender Male, 75.3%
Happy 99.6%
Calm 0.3%
Confused 0%
Sad 0%
Disgusted 0%
Surprised 0%
Fear 0%
Angry 0%
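
The three face records above match AWS Rekognition's DetectFaces output when full attributes are requested: an age range, a gender estimate, and an emotion distribution per detected face. A minimal boto3 sketch, with the region and file name again placeholders:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```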

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
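
The Google Vision rows report per-face likelihood buckets rather than percentages, which is the format of the face detection feature in the google-cloud-vision client library. A minimal sketch, assuming that library, application default credentials, and a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihood enums map to the buckets above (VERY_UNLIKELY ... VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```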

Feature analysis

Amazon

Person 99.5%
Chair 95.8%
Sunglasses 64.2%

Captions

Microsoft

a group of people posing for a photo 97%
a group of people posing for the camera 96.9%
a group of people posing for a picture 96.8%
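
Three ranked caption candidates are characteristic of Azure Computer Vision's Describe Image operation with `maxCandidates=3`. A minimal sketch, with the endpoint, key, and file name as placeholders:

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
response.raise_for_status()

# Candidates arrive with 0-1 confidences; scale to match the list above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```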

Text analysis

Amazon

6
MAQOX
YT33A2
M 117 YT33A2 MAQOX
MIII YT37A2 MAQOX
M 117
MIII
YT37A2
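
Given the Rekognition attribution elsewhere on this page, the Amazon strings above presumably come from the DetectText operation, which returns both full lines and individual words. A minimal boto3 sketch with placeholder region and file name:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; print the raw strings as above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])
```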

Google

KODYK 2.YEETA ETIW KODYK
KODYK
ETIW
2.YEETA
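
The garbled-looking strings in both lists are likely the photograph's film edge markings (mirrored "KODAK SAFETY" printing) transcribed literally by OCR. The Google results match the text detection feature of google-cloud-vision, where the first annotation is the full detected block and the rest are individual words. A minimal sketch under the same assumptions as the face detection example above:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("untitled_party_scan.jpg", "rb") as f:  # hypothetical scan
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

# First annotation: full block; remaining annotations: individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```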