Human Generated Data

Title

Untitled (woman and child, seated, three-quarter length)

Date

1839 - c. 1860

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3876

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.3
Human 99.3
Person 98.5
Painting 94.4
Art 94.4
Clothing 88.2
Apparel 88.2
Photography 68.5
Portrait 68.5
Photo 68.5
Face 68.5
Girl 62.5
Female 62.5
Drawing 56.4
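
These Amazon tags have the shape of AWS Rekognition label detection output (label name plus confidence). A minimal reproduction sketch in Python, assuming boto3 is configured with valid AWS credentials; the image file name is a placeholder:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the digitized photograph from disk (placeholder file name).
    with open("untitled_woman_and_child.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the list above bottoms out around 56%
        )

    # Print "Label confidence" pairs, mirroring the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")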

Imagga
created on 2022-01-29

container 69.6
tray 62.5
receptacle 50.6
wallet 42.2
old 41.8
vintage 40.5
retro 36.1
case 31.7
frame 30.9
grunge 29
antique 21.6
art 21.6
texture 20.8
ancient 20.8
empty 20.6
blank 20.6
paper 19.6
stamp 19.3
mail 18.2
wall 18
post 17.2
postage 16.7
design 16.3
aged 16.3
border 16.3
brown 15.5
decoration 15.3
purse 15.2
postmark 14.8
postal 14.7
chalkboard 14.7
bag 14.5
symbol 14.1
black 13.8
letter 13.8
gold 13.2
finance 12.7
board 12.7
decorative 12.5
icon 11.9
business 11.5
binder 11.4
textured 11.4
protective covering 11
ornate 11
blackboard 10.9
wood 10
painting 9.9
covering 9.8
history 9.8
museum 9.7
wooden 9.7
baby 9.7
metal 9.7
text 9.6
pattern 9.6
concrete 9.6
golden 9.5
money 9.4
3d 9.3
page 9.3
cover 9.3
communication 9.2
historic 9.2
technology 8.9
philately 8.9
chalk 8.8
closeup 8.8
close 8.6
wallpaper 8.4
rough 8.2
style 8.2
binding 8.2
fetus 8.2
backgrounds 8.1
collection 8.1
material 8.1
object 8.1
interior 8
philatelic 7.9
education 7.8
ornament 7.8
space 7.8
rustic 7.7
obsolete 7.7
worn 7.6
rusty 7.6
old fashioned 7.6
grungy 7.6
display 7.4
backdrop 7.4
central processing unit 7.3
success 7.2
dirty 7.2
portrait 7.1
information 7.1
book 7
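
The Imagga tags could be retrieved from Imagga's REST tagging endpoint. A hedged sketch using the requests library; the endpoint path, auth scheme, and response shape are assumptions based on Imagga's public v2 API, and the key, secret, and image URL are placeholders:

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "api-key", "api-secret"  # placeholders
    image_url = "https://example.org/untitled_woman_and_child.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry carries a confidence score and a language-keyed tag name.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")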

Google
created on 2022-01-29

Brown 98
Picture frame 96.8
Rectangle 87.8
Art 83.7
Event 66.4
Metal 66.1
Vintage clothing 65.8
Visual arts 65.4
Interior design 63.1
Antique 62.4
Stock photography 62.4
Oval 59.6
Circle 58.6
Child 58.5
Painting 54.8
Collectable 52.9
History 51.9
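
The Google tags look like Cloud Vision label detection results (a description plus a 0-1 score, shown above as a percentage). A minimal sketch with the google-cloud-vision client library, assuming Google Cloud credentials are configured; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_woman_and_child.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for annotation in response.label_annotations:
        # Scores are 0-1; scale to match the percentages listed above.
        print(f"{annotation.description} {annotation.score * 100:.1f}")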

Microsoft
created on 2022-01-29

human face 98.3
person 95.2
text 91.8
painting 84.8
picture frame 79.7
baby 78.1
clothing 73.6
old 59.8
case 58.3
accessory 30.2
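
The Microsoft tags match the output of Azure Computer Vision's tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    result = client.tag_image("https://example.org/untitled_woman_and_child.jpg")
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")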

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 99.9%
Calm 88.4%
Sad 11%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Happy 0.1%
Surprised 0.1%
Confused 0%

AWS Rekognition

Age 22-30
Gender Female, 99.7%
Calm 57%
Sad 35.7%
Fear 2.7%
Confused 2.3%
Surprised 1%
Disgusted 0.5%
Angry 0.5%
Happy 0.4%
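
The two AWS Rekognition entries above (an age range, gender, and ranked emotions for each detected face, i.e. the child and the woman) have the shape of a DetectFaces call with all facial attributes requested. A minimal sketch, again assuming boto3 and a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("untitled_woman_and_child.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions come back unordered; sort by confidence to mirror the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")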

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 13
Gender Female
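
The Microsoft Cognitive Services estimates (a single age and gender per face) resemble Azure Face API detection with age and gender attributes requested. A hedged sketch with the azure-cognitiveservices-vision-face client; the endpoint, key, and image URL are placeholders, and the attribute set is an assumption:

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    face_client = FaceClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    faces = face_client.face.detect_with_url(
        url="https://example.org/untitled_woman_and_child.jpg",  # placeholder URL
        return_face_attributes=["age", "gender"],
    )
    for face in faces:
        attrs = face.face_attributes
        # gender may deserialize as an enum or a plain string; handle both.
        gender = attrs.gender.value if hasattr(attrs.gender, "value") else attrs.gender
        print(f"Age {attrs.age:.0f}")
        print(f"Gender {str(gender).title()}")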

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
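
The Google Vision rows (surprise, anger, sorrow, joy, headwear, and blurred, each as a likelihood bucket) correspond to Cloud Vision face detection. A minimal sketch with the same google-cloud-vision client; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_woman_and_child.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    # Likelihood enum values 0-5 map to these labels.
    likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])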

Feature analysis

Amazon

Person 99.3%
Painting 94.4%

Captions

Microsoft

an old photo of a person 55.8%
old photo of a person 50.5%
a clock on the wall 38.3%
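
The captions with confidences match Azure Computer Vision's describe operation, which returns ranked caption candidates. A hedged sketch reusing the same Computer Vision client as above; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    description = client.describe_image(
        "https://example.org/untitled_woman_and_child.jpg",  # placeholder URL
        max_candidates=3,
    )
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")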

Text analysis

Google

000000000000
000000000000
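
The detected strings come from OCR; their shape matches Cloud Vision text detection, which returns the full detected text followed by the individual regions. A minimal sketch, with the file name again a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_woman_and_child.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)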