Human Generated Data

Title

Peasant man

Date

19th century

Classification

Sculpture

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Richard Currier, 1964.64.29

Machine Generated Data

Tags

Amazon
created on 2022-02-19

Person 93.6
Human 93.6
Symbol 82.4
Arrow 82.4
Person 80
Cushion 71
Finger 65.4
Clothing 59.6
Apparel 59.6
Shoe 59.6
Footwear 59.6
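
These label/confidence pairs have the shape of output from the AWS Rekognition DetectLabels API, whose confidences are percentages. A minimal Python sketch of a call that could produce such pairs, assuming boto3 with configured AWS credentials and a hypothetical local image file "peasant_man.jpg":

    import boto3

    client = boto3.client("rekognition")

    # Read the image bytes and request labels above a confidence floor.
    with open("peasant_man.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # Rekognition confidences are percentages
        )

    # Print name/confidence pairs in the same style as the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))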

Imagga
created on 2022-02-19

holster 100
sheath 100
protective covering 75.6
covering 55.6
splint 16.6
body 15.2
mechanical device 13.3
brown 13.3
fashion 12.8
business 12.8
texture 12.5
leather 12.3
close 12
money 11.9
device 11.8
healthy 11.3
style 11.1
currency 10.8
hand 10.6
wooden 10.5
health 10.4
paper 10.2
man 10.1
bag 9.9
mechanism 9.9
closeup 9.4
footwear 9.4
wood 9.3
old 9.1
object 8.8
home 8.8
male 8.5
finance 8.4
black 8.4
tool 8.3
cash 8.2
weapon 8
hands 7.8
equipment 7.8
construction 7.7
skin 7.6
legs 7.5
lying 7.5
slim 7.4
fit 7.4
shopping 7.3
food 7.3
metal 7.2
detail 7.2
color 7.2
sexy 7.2
bank 7.2
shoe 7.2
financial 7.1
adult 7.1
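
These tags match the shape of Imagga's image tagging endpoint, which returns tags scored from 0 to 100. A minimal sketch, assuming the requests library, placeholder API credentials, and a hypothetical public image URL:

    import requests

    # Placeholder credentials; Imagga uses HTTP Basic auth with a key/secret pair.
    AUTH = ("YOUR_API_KEY", "YOUR_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/peasant_man.jpg"},  # hypothetical URL
        auth=AUTH,
    )

    # Tags arrive sorted by confidence, matching pairs such as "holster 100" above.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))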

Google
created on 2022-02-19

Joint 97.5
Hand 96.1
Human body 88.5
Sleeve 87.2
Wood 85.7
Gesture 85.3
Finger 83.5
Beige 82.7
Wrist 77.9
Creative arts 77.5
Thumb 73.9
Personal protective equipment 73.1
Foot 72.6
Human leg 70.2
Stuffed toy 70.1
Toy 69.2
Fashion accessory 67.4
Elbow 66.6
Working animal 66.3
Magenta 65.1
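
These entries follow Google Cloud Vision label detection, which scores labels from 0 to 1. A minimal sketch using the google-cloud-vision client, assuming GOOGLE_APPLICATION_CREDENTIALS is configured and the same hypothetical local image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("peasant_man.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scale the 0-1 score to a percentage to match the values listed above.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))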

Microsoft
created on 2022-02-19

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 81.2%
Fear 32.8%
Surprised 29.2%
Happy 11.5%
Sad 9.4%
Angry 5.9%
Calm 4.2%
Confused 3.7%
Disgusted 3.3%
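
The age range, gender, and emotion percentages above have the shape of the Rekognition DetectFaces API with all attributes requested. A minimal sketch, again assuming boto3 credentials and a hypothetical local image:

    import boto3

    client = boto3.client("rekognition")

    with open("peasant_man.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back with percentage confidences, as listed above.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")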

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
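
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch of the corresponding face detection call, under the same credential and filename assumptions as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("peasant_man.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)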

Feature analysis

Amazon

Person 93.6%
Shoe 59.6%
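
The feature analysis entries correspond to labels that Rekognition localizes in the image: DetectLabels returns per-instance bounding boxes and confidences for object classes such as Person and Shoe. A minimal sketch of reading them from the same response as above:

    import boto3

    client = boto3.client("rekognition")

    with open("peasant_man.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_labels(Image={"Bytes": f.read()})

    # Only some labels carry Instances; each instance has its own confidence
    # and a bounding box expressed as fractions of image width and height.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(label["Name"], round(instance["Confidence"], 1), box)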