Human Generated Data

Title

Untitled (old photograph of man and woman)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19160

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-03-05

Human 99.7
Person 99.7
Person 99.3
Tie 90
Accessories 90
Accessory 90
Interior Design 87.9
Indoors 87.9
Face 79.7
Apparel 77.5
Clothing 77.5
Monitor 75.5
Electronics 75.5
Display 75.5
Screen 75.5
Performer 62.6
Room 60
Hat 57.1
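As an illustration only (not part of the museum's or Amazon's actual pipeline), confidence-scored tag lists like the one above are often post-processed by filtering on a threshold and dropping duplicate labels. The sketch below hard-codes the (label, score) pairs from the Amazon list; the `confident_labels` helper and the 80.0 threshold are assumptions chosen for the example.

```python
# Hypothetical sketch: filtering machine-generated tags by confidence.
# The (label, score) pairs are copied from the Amazon tag list above;
# the threshold value is an arbitrary choice for illustration.
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Person", 99.3), ("Tie", 90.0),
    ("Accessories", 90.0), ("Accessory", 90.0), ("Interior Design", 87.9),
    ("Indoors", 87.9), ("Face", 79.7), ("Apparel", 77.5), ("Clothing", 77.5),
    ("Monitor", 75.5), ("Electronics", 75.5), ("Display", 75.5),
    ("Screen", 75.5), ("Performer", 62.6), ("Room", 60.0), ("Hat", 57.1),
]

def confident_labels(pairs, threshold=80.0):
    """Return de-duplicated labels whose confidence meets the threshold."""
    seen, kept = set(), []
    for label, score in pairs:
        if score >= threshold and label not in seen:
            seen.add(label)
            kept.append(label)
    return kept

print(confident_labels(tags))
# → ['Human', 'Person', 'Tie', 'Accessories', 'Accessory', 'Interior Design', 'Indoors']
```

With the default threshold, the speculative low-confidence labels (Monitor, Screen, Performer, Hat) drop out, which matches what a human cataloger would likely keep for this photograph.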

Imagga
created on 2022-03-05

television 96
telecommunication system 60.6
monitor 35.7
screen 26.7
background 26.6
person 24
business 23.7
display 23.3
computer 23.2
office 20.9
adult 20.7
black 20.5
man 20.2
laptop 19.9
people 19.5
electronic equipment 19.4
car 18.5
portrait 18.1
equipment 17.9
technology 17.8
attractive 16.8
liquid crystal display 16.6
work 16.5
broadcasting 16
male 15.6
sitting 15.5
working 15
driver 14.6
vehicle 14
happy 13.8
smiling 13.7
modern 13.3
desk 13.2
face 12.8
one 12.7
transportation 12.6
telecommunication 12.2
smile 12.1
pretty 11.9
automobile 11.5
businessman 11.5
hand 11.4
human 11.2
electronic device 10.7
looking 10.4
suit 9.9
lady 9.7
corporate 9.4
keyboard 9.4
professional 9.3
inside 9.2
businesswoman 9.1
holding 9.1
student 9.1
notebook 8.9
hair 8.7
driving 8.7
drive 8.5
telephone 8.5
manager 8.4
entertainment 8.3
alone 8.2
worker 8
job 8
interior 8
table 7.8
auto 7.7
communication 7.6
wheel 7.5
fashion 7.5
electronic 7.5
medium 7.4
new 7.3
danger 7.3
women 7.1
indoors 7

Microsoft
created on 2022-03-05

text 99.9
human face 98.5
book 98
person 94.8
posing 88.1
black 86.9
old 86.7
black and white 86.4
clothing 84.2
portrait 77.7
smile 75.2
white 71.3
vintage 28.1

Face analysis

Amazon

AWS Rekognition

Age 16-22
Gender Female, 84.4%
Calm 85.4%
Sad 8.6%
Surprised 2.9%
Angry 1.7%
Confused 0.4%
Fear 0.4%
Happy 0.3%
Disgusted 0.3%
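As a small illustrative sketch (the function name is my own, not an AWS Rekognition API), emotion scores like those above are commonly reduced to a single dominant emotion by taking the maximum-confidence entry. The values below are copied from the face-analysis list above.

```python
# Hypothetical sketch: picking the dominant emotion from Rekognition-style
# confidence scores. Values copied from the face-analysis list above.
emotions = {
    "Calm": 85.4, "Sad": 8.6, "Surprised": 2.9, "Angry": 1.7,
    "Confused": 0.4, "Fear": 0.4, "Happy": 0.3, "Disgusted": 0.3,
}

def dominant_emotion(scores):
    """Return the (name, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # → ('Calm', 85.4)
```

Note that the scores sum to roughly 100, so they behave like a probability distribution over the emotion classes; here "Calm" dominates by a wide margin.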

Feature analysis

Amazon

Person 99.7%
Tie 90%

Captions

Microsoft

a vintage photo of a man 92.5%
a vintage photo of a man and woman posing for a picture 70.5%
a vintage photo of a man and a woman posing for a picture 70.4%