Human Generated Data

Title

Woman (daguerreotype)

Date

19th century

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Bequest of Alice B. Lorris, BR71.18.I

Machine Generated Data

Tags

Numbers are each service's confidence scores, expressed as percentages.

Amazon
created on 2022-02-26

Window 98.4
Human 97.2
Person 97.2
Porthole 89.2
Painting 60.6
Art 60.6
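
These labels follow the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of how such a tag list could be produced with boto3; the file name and confidence floor are illustrative assumptions, not details from this record, and this is not necessarily the museum's actual pipeline:

    import boto3

    # Hypothetical local copy of the catalog image; the file name is a placeholder.
    rekognition = boto3.client("rekognition")

    with open("woman_daguerreotype.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed threshold, chosen for illustration
        )

    # Each label carries a confidence score in percent, as in the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')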

Imagga
created on 2022-02-26

washer 100
white goods 100
home appliance 88
appliance 64
durables 29.3
close 20
equipment 14.9
technology 13.4
wheel 12.4
old 11.8
detail 11.3
machine 10.7
metal 10.5
business 10.3
dollar 10.2
money 10.2
currency 9.9
design 9.6
door 9.5
laundry 9.5
paper 9.4
sound 9.4
finance 9.3
clean 9.2
music 9
wealth 9
digital 8.9
washing 8.7
circle 8.6
bill 8.6
black 8.4
texture 8.3
entertainment 8.3
vintage 8.3
closeup 8.1
silver 8
hole 7.9
curve 7.9
glass 7.8
window 7.8
retro 7.4
cash 7.3
color 7.2
art 7.2
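
Imagga exposes its tagger as a REST endpoint rather than an SDK. A hedged sketch of the kind of request that yields a tag list like the one above, using Imagga's v2 tags endpoint; the credentials and file name are placeholders:

    import requests

    # Placeholder credentials; Imagga issues a key/secret pair per account.
    AUTH = ("YOUR_API_KEY", "YOUR_API_SECRET")

    with open("woman_daguerreotype.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=AUTH,
            files={"image": f},
        )

    # Tags come back ordered by confidence (0-100), mirroring the list above.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], tag["confidence"])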

Google
created on 2022-02-26

Picture frame 94.7
Wood 84.7
Art 82.4
Rectangle 77.4
Circle 76.9
Font 75
Painting 69.7
Visual arts 69.1
Illustration 67.6
Room 56.7
Carving 56.2
Metal 55.5
History 54.8
Still life 51.2
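
Google's labels correspond to the Cloud Vision label_detection feature, whose scores are returned on a 0-1 scale and read naturally as percentages. A minimal sketch, again with an assumed file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Placeholder file name, as in the earlier sketches.
    with open("woman_daguerreotype.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are returned in [0, 1]; scale to percent to match the list above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")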

Microsoft
created on 2022-02-26

person 95.5
art 87
clothing 86.5
human face 85.4
man 69.3
appliance 57.9
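
Microsoft's tags look like output from the Azure Computer Vision analyze operation with the tags visual feature. A sketch assuming the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint and key:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    with open("woman_daguerreotype.jpg", "rb") as f:
        analysis = client.analyze_image_in_stream(
            f, visual_features=[VisualFeatureTypes.tags]
        )

    # Tag confidences are in [0, 1]; scale to percent to match the list above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")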

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 51.9%
Calm 64.7%
Sad 21.5%
Fear 6.8%
Confused 2.6%
Angry 1.4%
Disgusted 1.3%
Surprised 1.2%
Happy 0.5%
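
The age range, gender estimate, and emotion scores above follow the shape of Rekognition's DetectFaces response when all facial attributes are requested. A sketch using the same assumed image file as before:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("woman_daguerreotype.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # emotions and demographics require ALL
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are scored independently; sort to match the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')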

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
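
Unlike Rekognition, Google Vision reports face attributes as coarse likelihood buckets rather than numeric scores. A sketch using face_detection; the enum names map directly to the "Very unlikely" through "Very likely" labels above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("woman_daguerreotype.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY).
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)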

Feature analysis

Amazon

Person 97.2%
Painting 60.6%

Captions

Microsoft

a cat sitting on top of a door 38.3%
a cat that is looking at the camera 36.3%
a cat sitting next to a door 27.7%
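
These alternative captions, each with its own confidence, match the Azure Computer Vision describe operation, which returns up to max_candidates candidate captions per image. A sketch with the same placeholder client setup as in the tag example above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key, as in the tag sketch above.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    with open("woman_daguerreotype.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Up to three candidate captions, each with a confidence in [0, 1].
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")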