Human Generated Data

Title

Untitled (possibly James Jennison, Tutor in History, Instructor in Elocution and Registrar, Harvard University)

Date

c. 1858

People

Artist: John Adams Whipple, American, 1822-1891

Sitter: James Jennison

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Arthur S. Eldredge, 2.2002.2084

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.5
Human 98.5
Art 94.4
Face 93.9
Drawing 79.3
Clothing 77.7
Apparel 77.7
Painting 76.4
Portrait 72.1
Photography 72.1
Photo 72.1
Head 70.7
Text 65.5
Overcoat 61.2
Coat 61.2
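
Label-and-confidence pairs like the Amazon list above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of producing such a list with boto3; the file name and thresholds are assumptions for illustration, not part of this record:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("whipple_jennison.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=60,
)

for label in response["Labels"]:
    # Prints rows comparable to "Person 98.5", "Art 94.4", etc.
    print(f"{label['Name']} {label['Confidence']:.1f}")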

Clarifai
created on 2023-10-25

one 99
portrait 98.9
people 97.4
art 97.3
retro 96.5
no person 96.4
man 95.1
woman 95
adult 94.8
wear 93.9
two 93.3
sepia 90
old 89.3
facial expression 88.3
painting 87.1
vintage 87
indoors 86
wood 83.6
conceptual 82.9
girl 82.1

Imagga
created on 2022-01-08

device 20.7
hat 19
black 15.6
pick 15.3
light bulb 14.4
bell 14.2
drawing 13.2
diagram 12.4
acoustic device 11.7
electric lamp 11.5
symbol 10.8
vintage 10.7
covering 10
art 10
currency 9.9
close 9.7
sombrero 9.5
headdress 9.5
lamp 9.5
money 9.3
clothing 9.2
old 9
texture 9
representation 8.9
signaling device 8.9
sun 8.8
book jacket 8.7
design 8.5
space 8.5
dark 8.3
style 8.1
closeup 8.1
man 8.1
cash 7.3
paper 7.2
fruit 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.8
human face 97.7
man 91.6
person 89.8
electronics 71.9
old 61.9

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.5%
Calm 98.3%
Confused 0.4%
Sad 0.4%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%
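
Age range, gender, and emotion scores like the ones above are what Rekognition's DetectFaces operation reports when all facial attributes are requested. A minimal sketch with boto3, again assuming a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("whipple_jennison.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 45, "High": 51}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 99.5}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")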

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
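
The likelihood ratings above ("Very unlikely" through "Very likely") are how the Google Cloud Vision face detection API expresses its face attributes. A minimal sketch with the google-cloud-vision client library, assuming default credentials and a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("whipple_jennison.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood enums render as VERY_UNLIKELY ... VERY_LIKELY,
    # matching rows such as "Joy Very unlikely" above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)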

Feature analysis

Amazon

Person 98.5%

Categories

Imagga

paintings art 95.8%
food drinks 1.9%
interior objects 1.2%

Captions

Microsoft
created on 2022-01-08

an old photo of a sign 57.5%
a close up of a sign 57.4%
a close up of a box 55.4%
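
Ranked caption candidates like these are typical of the Describe Image operation in Azure's Computer Vision service. A minimal sketch assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and subscription key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("whipple_jennison.jpg", "rb") as image:  # hypothetical local copy
    analysis = client.describe_image_in_stream(image, max_candidates=3)

for caption in analysis.captions:
    # Prints candidates comparable to "an old photo of a sign 57.5%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")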

Text analysis

Amazon

Immison
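
The detected string above is OCR output as returned by Amazon Rekognition's DetectText operation. A minimal sketch with boto3, assuming a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("whipple_jennison.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":       # skip word-level duplicates
        print(detection["DetectedText"])  # e.g. the "Immison" reading above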