Human Generated Data

Title

Sarah Jane Manship

Date

1930

People

Artist: Paul Manship, American, 1885–1966

Classification

Sculpture

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.1035

Machine Generated Data

Tags

Amazon
created on 2022-06-17

Figurine 92.7
Person 88.5
Human 88.5
Sculpture 79.4
Art 79.4
Statue 70.8
Head 61.9
Dish 56.2
Food 56.2
Meal 56.2
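
These Amazon tags are the kind of labels returned by Amazon Rekognition's DetectLabels operation. A minimal Python sketch with boto3 follows, assuming configured AWS credentials and a local copy of the image (the filename here is a placeholder):

import boto3

client = boto3.client("rekognition")

# Placeholder filename; any local copy of the image works.
with open("sarah_jane_manship.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=55,
)

# Each label carries a name and a 0-100 confidence score,
# matching the "Figurine 92.7" style pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')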

Clarifai
created on 2023-10-29

people 99.6
portrait 98.9
monochrome 98.8
baby 98.2
child 98.1
one 97.5
man 97.4
son 97.2
art 96.5
doll 95.5
sculpture 94.7
statue 93.7
black and white 93.2
sepia 89.5
mannequin 87.7
wear 87
adult 86.9
facial expression 83.4
girl 81.4
retro 79.4
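
Concepts like those above can be reproduced with a predict call against Clarifai's v2 REST API. A minimal sketch with requests, assuming a valid API key; the key, model ID, and image URL are placeholders:

import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a 0-1 "value"; scaled by 100 they match
# the "people 99.6" style scores listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')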

Imagga
created on 2022-06-17

diaper 69.9
child 62.5
clothing 58.3
baby 51.5
garment 49.1
consumer goods 39.6
covering 39.3
kid 38.1
cute 36.6
knee pad 36.3
boy 33.1
little 32.7
childhood 31.4
blond 29.4
protective garment 29
toddler 28.7
happy 28.2
infant 27
portrait 24.6
adorable 24
person 23.6
smiling 23.2
doll 22.8
son 22.2
face 21.3
cheerful 21.2
happiness 20.4
toilet 20.1
commodity 19.6
fun 19.5
innocence 19.3
plaything 19.1
expression 18.8
studio 18.3
human 18
people 17.3
innocent 16.5
smile 15
looking 14.4
sweet 14.2
joy 14.2
male 14
eyes 13.8
funny 13.8
youth 13.6
newborn 13.6
skin 13.6
pretty 13.3
care 12.4
playful 12.3
hand 12.2
play 12.1
children 11.9
family 11.6
healthy 11.3
kids 11.3
hair 11.1
toy 11.1
life 11
playing 11
lovely 10.7
mother 10.6
daughter 10.5
sitting 10.3
love 10.3
lifestyle 10.1
adult 9.7
vertical 9.6
body 9.6
standing 9.6
joyful 9.2
girls 9.1
black 9
one 9
parent 8.8
hands 8.7
model 8.6
attractive 8.4
health 8.3
fashion 8.3
man 8.1
indoors 7.9
arms 7.7
hygiene 7.6
head 7.6
positive 7.4
exercise 7.3
dress 7.2
home 7.2
activity 7.2
look 7
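
Imagga exposes tags like these through its /v2/tags endpoint, authenticated with an API key and secret over HTTP basic auth. A minimal sketch with requests; the credentials and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Each tag has a 0-100 confidence and a language-keyed name,
# matching the "diaper 69.9" style pairs above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')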

Google
created on 2022-06-17

Microsoft
created on 2022-06-17

wall 99.8
baby 97.5
indoor 96.8
toddler 96.5
text 85.6
black and white 82
human face 79.4
child 60.3
toilet 43.3
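
The Microsoft tags come from the Azure Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, assuming an Azure resource; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/image.jpg")

# Confidence is 0-1; scaled by 100 it matches the "wall 99.8" pairs above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")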

Face analysis

AWS Rekognition

Age 0-3
Gender Male, 98.7%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0%
Angry 0%
Disgusted 0%
Confused 0%
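
The age range, gender, and per-emotion confidences above map directly onto the response of Amazon Rekognition's DetectFaces operation. A minimal boto3 sketch, assuming configured credentials and a local image file (the filename is a placeholder):

import boto3

client = boto3.client("rekognition")

with open("sarah_jane_manship.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]                      # e.g. {"Low": 0, "High": 3}
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]                     # value plus confidence
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:            # one confidence per emotion type
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')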

Microsoft Cognitive Services

Age 0
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
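
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch with the google-cloud-vision client library, assuming configured Google Cloud credentials; the image URL is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.com/image.jpg"

response = client.face_detection(image=image)

# Each likelihood is an enum; .name yields strings like "VERY_UNLIKELY".
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)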

Feature analysis

Amazon

Person 88.5%

Categories

Imagga

paintings art 92.1%
people portraits 7.2%
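
Imagga's category scores come from a separate categorizer endpoint; labels such as "paintings art" and "people portraits" match its personal_photos categorizer. A minimal sketch with requests; the credentials and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Categories come back with 0-100 confidences and language-keyed names.
for category in resp.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')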

Captions

Microsoft
created on 2022-06-17

a person holding a baby 30.9%
a close up of a baby 30.8%
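
Candidate captions with confidence scores like these can be generated with the Azure Computer Vision describe operation. A minimal sketch with the same SDK as above, assuming an Azure resource; endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image(
    "https://example.com/image.jpg", max_candidates=2
)

# Each candidate caption carries a 0-1 confidence, matching the
# "a person holding a baby 30.9%" entries above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")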