Human Generated Data

Title

Untitled (re-photographed 19th century image of mother and three children)

Date

1900-1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9064

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99
Human 99
Person 98.8
Helmet 96.8
Clothing 96.8
Apparel 96.8
Person 95.3
Helmet 89.6
Person 85.3
Helmet 83
Art 71.1
Helmet 69.5
Drawing 65.1
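
The Amazon tags above are the kind of labels returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how comparable tags could be generated with boto3 follows; the S3 bucket and object names are hypothetical placeholders for the digitized photograph, not values from this record.

import boto3

# Hypothetical S3 location of the digitized image.
rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9064.jpg"}},
    MaxLabels=20,
    MinConfidence=60,
)
for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent.
    print(label["Name"], round(label["Confidence"], 1))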

Clarifai
created on 2023-10-26

people 99.8
group 98.6
one 97.8
leader 97.6
adult 97.3
man 96.5
portrait 92.6
five 92
two 91.7
wear 90.6
retro 87.4
vintage 87.3
woman 86.2
monarch 85.7
art 85.2
veil 83.8
old 83.7
three 81.8
antique 81
many 79.8
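
The Clarifai concepts above resemble output from Clarifai's general image-recognition model. A rough sketch against the v2 REST API is given below; the API key, model identifier, and image URL are assumptions, not values taken from this record.

import requests

# Hypothetical API key, model id, and image URL.
headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/4.2002.9064.jpg"}}}]}
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 probabilities; scale to percent to match the list above.
    print(concept["name"], round(concept["value"] * 100, 1))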

Imagga
created on 2022-01-23

tray 100
receptacle 100
container 100
vintage 33.1
grunge 31.5
old 31.3
frame 31.1
retro 28.7
texture 23.6
paper 21.2
art 19.5
blank 18.8
border 18.1
black 18
chalkboard 17.6
aged 17.2
board 15.4
dirty 15.4
empty 14.6
design 14.1
business 14
blackboard 13.7
textured 13.1
chalk 12.7
space 12.4
symbol 12.1
antique 12.1
page 12.1
wall 12
card 11.9
rough 11.8
purse 11.8
bag 11.5
mail 11.5
rusty 11.4
grungy 11.4
pattern 10.9
album 10.7
box 10.6
money 10.2
finance 10.1
wallpaper 9.9
school 9.9
postage 9.8
postal 9.8
scrapbook 9.7
technology 9.6
text 9.6
brown 9.6
education 9.5
ancient 9.5
communication 9.2
note 9.2
decorative 9.2
letter 9.2
message 9.1
backdrop 9.1
paint 9
material 8.9
backgrounds 8.9
object 8.8
icon 8.7
stamp 8.7
damaged 8.6
post 8.6
close 8.6
photography 8.5
3d 8.5
learn 8.5
study 8.4
cover 8.3
closeup 8.1
currency 8.1
graphic 8
pencil box 8
invitation 7.7
sign 7.5
sheet 7.5
wood 7.5
greeting 7.4
style 7.4
cash 7.3
metal 7.2
detail 7.2
square 7.2
decoration 7.2
idea 7.1
surface 7
wooden 7
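
The Imagga tags above are typical of Imagga's auto-tagging endpoint. A brief sketch using the v2 REST API follows; the credentials and image URL are placeholders.

import requests

# Hypothetical Imagga credentials (HTTP basic auth) and image URL.
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.9064.jpg"},
    auth=auth,
)
for item in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score.
    print(item["tag"]["en"], round(item["confidence"], 1))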

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.8
drawing 97.6
sketch 95.4
old 94.2
indoor 90
person 88.9
clothing 84.6
black 78.5
human face 74.9
painting 72.4
white 69.2
cartoon 59.4
picture frame 53.8
vintage 49
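
The Microsoft tags above correspond to the Tags feature of the Azure Computer Vision Analyze Image API. A minimal sketch is given below; the resource endpoint, subscription key, and image URL are hypothetical.

import requests

# Hypothetical Azure endpoint, key, and image URL.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}
params = {"visualFeatures": "Tags"}
body = {"url": "https://example.org/4.2002.9064.jpg"}
resp = requests.post(f"{endpoint}/vision/v3.2/analyze", headers=headers, params=params, json=body)
for tag in resp.json()["tags"]:
    # Confidence is 0-1; scale to percent to match the list above.
    print(tag["name"], round(tag["confidence"] * 100, 1))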

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 75.3%
Calm 59.8%
Surprised 35%
Happy 2.3%
Fear 1.8%
Sad 0.4%
Angry 0.3%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Male, 89.9%
Calm 99.3%
Surprised 0.6%
Angry 0%
Fear 0%
Happy 0%
Sad 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 26-36
Gender Male, 54.1%
Calm 88.2%
Fear 8.8%
Surprised 1.4%
Happy 1%
Sad 0.2%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 98.4%
Surprised 69.5%
Calm 23.5%
Fear 3.9%
Happy 1.2%
Sad 0.7%
Disgusted 0.5%
Angry 0.4%
Confused 0.2%
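
Each face block above matches the structure of an AWS Rekognition DetectFaces result with all attributes requested. A minimal boto3 sketch, again with hypothetical S3 names:

import boto3

# Hypothetical S3 location; Attributes=["ALL"] adds age range, gender, and emotions.
rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9064.jpg"}},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        # Emotion types arrive uppercase (e.g. CALM); title-case them for readability.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")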

Feature analysis

Amazon

Person 99%
Helmet 96.8%

Captions

Microsoft
created on 2022-01-23

a vintage photo of a person 77.8%
an old photo of a person 77.1%
a vintage photo of a box 66%
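
The captions above are consistent with the Description feature of the same Azure Computer Vision Analyze Image API. A short sketch, reusing the hypothetical endpoint, key, and image URL from the tags example:

import requests

# Hypothetical Azure endpoint, key, and image URL.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}
params = {"visualFeatures": "Description"}
body = {"url": "https://example.org/4.2002.9064.jpg"}
resp = requests.post(f"{endpoint}/vision/v3.2/analyze", headers=headers, params=params, json=body)
for caption in resp.json()["description"]["captions"]:
    # Caption confidence is 0-1; scale to percent to match the list above.
    print(caption["text"], round(caption["confidence"] * 100, 1))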

Text analysis

Amazon

EYE
EYE WYNNRKOA
WYNNRKOA
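
The OCR fragments above ("EYE", "WYNNRKOA") look like line- and word-level detections from AWS Rekognition's DetectText operation. A minimal sketch, with the same hypothetical S3 names as in the earlier examples:

import boto3

# Hypothetical S3 location; DetectText returns both LINE and WORD detections.
rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9064.jpg"}}
)
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))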