Human Generated Data

Title

Untitled (old photograph of man and woman)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19159

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.5
Person 99.5
Person 99.5
Face 91.8
Accessories 90.2
Accessory 90.2
Tie 90.2
Clothing 86
Apparel 86
Home Decor 84.9
Nature 71.4
Outdoors 67.3
Electronics 63.1
Screen 63.1
Hat 62.5
Head 61.5
Overcoat 60.7
Suit 60.7
Coat 60.7
Monitor 60
Display 60
Linen 59.2
Cake 58.2
Cream 58.2
Dessert 58.2
Icing 58.2
Food 58.2
Creme 58.2
Brick 57
Steamer 55.8
Text 55.8
Advertisement 55.1

Imagga
created on 2022-03-05

television 97
telecommunication system 67.9
monitor 39.5
laptop 31.2
computer 28
person 25.6
screen 24.9
technology 23.8
business 23.7
man 23.5
adult 22
background 21.8
liquid crystal display 20.8
black 20.4
people 20.1
portrait 20.1
display 19.7
equipment 19.1
electronic equipment 19
office 18.5
attractive 18.2
work 18.1
male 17.7
happy 16.9
smile 16.4
working 15.9
face 14.9
desk 14.2
modern 14
sitting 13.7
smiling 13
car 12.9
driver 12.6
notebook 12.6
pretty 12.6
businessman 11.5
lady 11.4
vehicle 11.2
one 11.2
looking 11.2
businesswoman 10.9
telephone 10.4
student 10
suit 9.9
automobile 9.6
hair 9.5
corporate 9.5
keyboard 9.4
expression 9.4
professional 9.3
worker 8.9
information 8.9
electronic device 8.8
conceptual 8.8
indoors 8.8
broadcasting 8.7
communication 8.4
fashion 8.3
holding 8.3
transportation 8.1
sexy 8
home 8
job 8
interior 8
typing 7.8
model 7.8
driving 7.7
studio 7.6
hand 7.6
tie 7.6
head 7.6
electronic 7.5
manager 7.4
service 7.4
phone 7.4
alone 7.3
danger 7.3
sensuality 7.3
make 7.3
women 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.9
book 99.5
human face 98.9
posing 98.1
person 95.9
black 90.5
clothing 86.9
old 85.8
drawing 81.5
portrait 79.1
sketch 77.6
white 77.5
smile 75.9
man 74.1
painting 65.9
black and white 57.6
vintage 36.5

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 93.6%
Calm 99.3%
Surprised 0.3%
Sad 0.2%
Angry 0.1%
Happy 0.1%
Confused 0%
Fear 0%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Tie 90.2%
Suit 60.7%

Captions

Microsoft

a vintage photo of a man and woman posing for a picture 79.6%
a vintage photo of a man and a woman posing for a picture 79.5%
a vintage photo of a man and woman posing for the camera 75.1%