Human Generated Data

Title

Untitled (man and woman at party)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19327

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Chair 98.7
Furniture 98.7
Chair 98.7
Person 97.6
Human 97.6
Person 97.3
Clothing 96.7
Apparel 96.7
Person 89
Chair 88.8
Face 70.3
Door 68.2
Portrait 60.8
Photography 60.8
Photo 60.8
Floor 57.6
Long Sleeve 56.4
Sleeve 56.4
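
The label/confidence pairs above follow the shape of an Amazon Rekognition label-detection response. A minimal sketch of how such tags could be produced, assuming boto3 credentials are already configured and a local copy of the image named photo.jpg (both assumptions, not part of this record):

import boto3

# Rekognition client; region and credentials come from the standard AWS config.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap roughly matching the length of the list above
    MinConfidence=50,    # drop low-confidence labels
)

# Each label carries a name and a confidence score on a 0-100 scale.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')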

Clarifai
created on 2023-10-22

people 99.6
man 98
art 96.5
wear 96.1
adult 96.1
furniture 94.9
one 93.1
painting 92.8
woman 92.5
portrait 91.5
indoors 91.4
two 90.7
print 90.1
sit 86.8
wedding 86
seat 84.8
illustration 82.9
chair 82.3
leader 82.2
retro 80.5

Imagga
created on 2022-02-25

barbershop 30.3
shop 29.3
salon 25.5
mercantile establishment 20.3
people 19.5
man 18.8
black 18.1
business 17
chair 16
person 15.5
robe 14.5
clothing 13.9
place of business 13.6
office 13.6
hairdresser 13.6
dress 13.5
male 13.5
adult 13.3
old 13.2
room 12.9
interior 12.4
building 12.2
fashion 12.1
modern 11.2
vintage 10.7
history 10.7
garment 10.6
businessman 10.6
indoors 10.5
men 10.3
women 10.3
wall 9.4
inside 9.2
window 9.2
indoor 9.1
art 9.1
silhouette 9.1
love 8.7
glass 8.5
adults 8.5
style 8.2
architecture 7.9
holiday 7.9
professional 7.9
design 7.9
couple 7.8
portrait 7.8
travel 7.7
sitting 7.7
two 7.6
elegance 7.5
clothes 7.5
light 7.3
historic 7.3
lady 7.3
color 7.2
suit 7.2
home 7.2
family 7.1

Google
created on 2022-02-25

Chair 90.5
Picture frame 76.3
Vintage clothing 74.6
Classic 71.7
Room 68.9
Art 67.7
Stock photography 65.6
Table 65.4
Sitting 64.5
Event 63.9
Monochrome photography 63.5
History 61.5
Rectangle 60.5
Monochrome 59.7
Sleeve 57.3
Font 56.3
Painting 50.4

Microsoft
created on 2022-02-25

text 99.4
furniture 94.3
chair 90.9
indoor 89.7
table 88
clothing 81.6
person 73.8
black and white 67.2
picture frame 22.6

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 98.5%
Happy 99.7%
Surprised 0.1%
Confused 0%
Angry 0%
Fear 0%
Calm 0%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 56-64
Gender Male, 100%
Calm 29.8%
Surprised 23.4%
Sad 11.9%
Fear 10.4%
Confused 9%
Disgusted 6.6%
Angry 6%
Happy 3%
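
The age range, gender, and emotion scores above correspond to the fields of a Rekognition face-analysis response. A minimal sketch, assuming the same hypothetical photo.jpg and configured boto3 credentials:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions beyond the defaults.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unsorted; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')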

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely
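
The likelihood ratings above match the face-annotation fields of the Google Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and application-default credentials (both assumptions):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation reports likelihoods as enum values (VERY_UNLIKELY .. VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)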

Feature analysis

Amazon

Chair
Chair 98.7%
Chair 98.7%
Chair 88.8%

Person
Person 97.6%
Person 97.3%
Person 89%

Captions

Microsoft
created on 2022-02-25

a person standing in a room 82.3%
a person in a room 81.9%
a person taking a selfie in a room 65.9%
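
The captions above have the shape of an Azure Computer Vision "describe image" result. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package and a hypothetical endpoint and key (all assumptions, not taken from this record):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key; substitute real resource values.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as image_stream:  # hypothetical local copy of the image
    description = client.describe_image_in_stream(image_stream, max_candidates=3)

# Each caption carries text and a confidence in the 0-1 range.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")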

Text analysis

Amazon

JAN
65
13
132
7
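
The strings above match the output of a Rekognition text-detection call. A minimal sketch, again assuming the hypothetical photo.jpg and configured boto3 credentials:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back as whole LINEs and individual WORDs; printing the words
# gives roughly the list shown above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])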

Google

13 132 JAN 65
13
132
JAN
65