Human Generated Data

Title

Untitled (man looking at flowers)

Date

c. 1892-c. 1905

People

Artist: Sarah Choate Sears, American, 1858-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Montgomery S. Bradley and Cameron Bradley, P1984.65

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 87.4%
Painting 83.7%
Art 83.7%
Bird 77.5%
Animal 77.5%
Finger 60%
Paper 55.2%
Person 42.6%

Clarifai
created on 2023-10-26

people 99.7%
portrait 99.3%
wear 98.6%
one 98.1%
sepia 97.9%
man 97.5%
retro 96.8%
vintage 95.8%
art 95.1%
two 94.8%
adult 94.7%
monochrome 94%
music 92.5%
facial expression 90.7%
gritty 90.6%
old 90.3%
child 89%
square 87.9%
side view 87.3%
sepia pigment 86.9%

Imagga
created on 2022-01-23

book jacket 74.2%
jacket 58.7%
wrapping 43.8%
covering 37.8%
binding 36.4%
texture 34.1%
grunge 29.8%
jersey 27.3%
vintage 26.5%
paper 25.3%
old 24.4%
blackboard 21.8%
black 21.7%
frame 21.7%
shirt 21.6%
blank 21.4%
chalkboard 20.6%
chalk 20.5%
textured 20.2%
antique 18.2%
pattern 17.8%
design 17.4%
retro 17.2%
empty 17.2%
garment 16.5%
envelope 15.8%
nobody 15.6%
art 15%
dirty 14.5%
board 14.4%
space 14%
note 13.8%
wall 13.7%
aged 13.6%
sign 13.5%
wallpaper 13%
page 13%
message 12.8%
brown 12.5%
material 12.5%
ancient 12.1%
border 11.8%
container 11.2%
card 11.1%
parchment 10.6%
write 10.4%
clothing 10%
backdrop 9.9%
school 9.9%
notice 9.7%
text 9.6%
symbol 9.4%
drawing 9.2%
close 9.1%
business 9.1%
currency 9%
reminder 8.7%
announcement 8.7%
worn 8.6%
grungy 8.5%
canvas 8.5%
money 8.5%
rough 8.2%
copy space 8.1%
backgrounds 8.1%
copy 8%
surface 7.9%
bill 7.6%
sheet 7.5%
decorative 7.5%
wood 7.5%
closeup 7.4%
banking 7.4%
yellow 7.3%
book 7.3%
detail 7.2%
bank 7.2%
wooden 7%
eraser 7%

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.3%
old 45.7%

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 99.6%
Calm 85.1%
Sad 10.9%
Confused 1%
Happy 0.7%
Fear 0.6%
Disgusted 0.6%
Angry 0.5%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 83.7%
Bird 77.5%
Person 42.6%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2022-01-23

an old photo of a person 67.7%
old photo of a person 64.5%
a person posing for a photo 62.8%