Human Generated Data

Title

Untitled (debutantes preparing for ball)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19261

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 97.2
Human 97.2
Interior Design 92.7
Indoors 92.7
Room 92.3
Person 88
Furniture 84.5
Person 78.2
LCD Screen 69.7
Monitor 69.7
Screen 69.7
Display 69.7
Electronics 69.7
Girl 61.4
Female 61.4
Living Room 57.4

Imagga
created on 2022-03-05

furniture 43
table 38.4
room 29.3
interior 27.4
home 22.3
indoors 21.1
lamp 19.3
desk 18.5
furnishing 18.2
chair 18.1
window 16.6
glass 15.9
house 15
office 14.9
computer 14.9
modern 14.7
luxury 14.6
decor 14.1
equipment 13.5
monitor 13.3
table lamp 13.1
design 12.9
technology 12.6
salon 12.6
wood 12.5
comfortable 12.4
working 12.4
device 11.8
cabinet 11.8
work 11.8
decoration 11.6
man 11.4
black 11.4
light 11.4
people 11.1
business 10.9
chairs 10.8
restaurant 10.6
dinner 10.4
style 10.4
adult 10.3
kitchen 10.3
wall 10.3
inside 10.1
health 9.7
medical 9.7
hospital 9.6
person 9.5
keyboard 9.4
male 9.2
color 8.9
healthy 8.8
lifestyle 8.7
elegant 8.6
dining 8.6
wine 8.4
china cabinet 8.4
floor 8.4
glasses 8.3
television 8.3
indoor 8.2
worker 8
food 8
medicine 7.9
vessel 7.8
elegance 7.6
relaxation 7.5
bottle 7.5
doctor 7.5
wineglass 7.4
electronic equipment 7.4
cup 7.3
domestic 7.2
smiling 7.2
alcohol 7.2
vase 7.1
portrait 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.7
indoor 97.3
black and white 94.6
person 80
computer 78.1
clothing 51.1

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.7%
Calm 77.1%
Confused 10%
Sad 9.3%
Angry 1.2%
Disgusted 0.9%
Surprised 0.7%
Happy 0.5%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Male, 90%
Sad 45.7%
Calm 42.3%
Angry 3.1%
Happy 3%
Confused 2.5%
Disgusted 1.5%
Fear 1.2%
Surprised 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%

Text analysis

Amazon

6
F
MAQOX
KAGOX
Y133
M 113 YT334 A°2 MAQOX
A°2
MII3 Y133 A°S KAGOX
A°S
MII3
M 113 YT334

Google

COEET
KODYK
EITW
KODYK COEET EITW KODYK 2 EE
2
EE