Human Generated Data

Title

"Crushed Lafayette...Take That Old Boy"

Date

c. 1834

People

Artist: Honoré-Victorin Daumier, French, 1808-1879

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, The Program for Harvard College, M13714

Machine Generated Data

Tags

Amazon
created on 2023-08-30

Photography 99.8
Clothing 99.7
Hat 99.7
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Art 98.5
Person 94.1
Person 91.8
Painting 87.6
Drawing 86
Person 85.8
Person 84.3
Face 80.1
Head 80.1
Person 79.1
Person 70.2
Person 69.5
Person 67.8
Portrait 64
Person 57.2
Coat 57.2
Hugging 55.1

Clarifai
created on 2023-11-01

people 99.9
street 99.6
one 98.5
adult 98.1
art 97
portrait 95.7
monochrome 94.3
man 94
analogue 92.7
wear 91.7
sadness 91.2
merchant 87.6
war 87.2
two 86.9
calamity 85.7
administration 85.6
child 85.4
despair 85
black and white 84.5
mono 84.2

Imagga
created on 2018-12-20

man 33.6
male 24.9
people 24
person 22.8
sport 19.1
outdoor 17.6
outdoors 17.3
adult 16.2
active 15.3
snow 14.8
park 14
leisure 13.3
mask 12.9
winter 12.8
sky 12.8
walk 12.4
adventure 12.3
walking 12.3
street 12
landscape 11.9
silhouette 11.6
lifestyle 11.6
cold 11.2
sitting 11.2
outside 11.1
mountain 10.7
scholar 10.7
looking 10.4
newspaper 10.3
city 10
recreation 9.9
vacation 9.8
intellectual 9.5
men 9.4
covering 9.2
backpack 9.2
black 9.2
suit 9.2
travel 9.2
alone 9.1
exercise 9.1
fun 9
one 9
hat 8.9
climb 8.8
solitude 8.7
extreme 8.6
season 8.6
child 8.4
old 8.4
product 8.2
happy 8.1
activity 8.1
helmet 8
working 8
holiday 7.9
summer 7.7
hiking 7.7
free 7.5
senior 7.5
top 7.4
freedom 7.3
business 7.3
sunset 7.2
building 7.2
portrait 7.1
grass 7.1

Google
created on 2018-12-20

Microsoft
created on 2018-12-20

text 98.7
book 90.7
person 90
snow 36
black and white 28.3
street 25
standing 12.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Confused 67%
Sad 34.4%
Fear 8.5%
Surprised 7.6%
Calm 1%
Disgusted 0.7%
Angry 0.4%
Happy 0.4%

AWS Rekognition

Age 30-40
Gender Male, 99.4%
Happy 38%
Calm 33%
Fear 22.2%
Surprised 6.6%
Sad 3.1%
Angry 1.3%
Disgusted 0.9%
Confused 0.6%

AWS Rekognition

Age 23-33
Gender Male, 98%
Calm 80.3%
Angry 9.1%
Surprised 6.8%
Fear 5.9%
Sad 5.7%
Confused 1%
Happy 0.8%
Disgusted 0.6%

AWS Rekognition

Age 13-21
Gender Male, 94.3%
Calm 78.6%
Fear 16.6%
Surprised 6.4%
Sad 2.6%
Confused 0.7%
Happy 0.7%
Angry 0.5%
Disgusted 0.4%

AWS Rekognition

Age 18-24
Gender Male, 95.5%
Sad 97.4%
Calm 20.6%
Fear 16.2%
Surprised 7.5%
Angry 5.1%
Disgusted 1.9%
Happy 0.5%
Confused 0.4%

AWS Rekognition

Age 29-39
Gender Male, 96.2%
Calm 46.5%
Surprised 38%
Happy 19.3%
Fear 6.4%
Sad 2.9%
Disgusted 2.1%
Confused 1.4%
Angry 1.3%

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%

Captions

Microsoft
created on 2018-12-20

an old photo of a man 74.7%
a black and white photo of a man 66%
a man holding a book 25.9%

Text analysis

Amazon

de
-
Lithe de
Lithe
On -
On