Human Generated Data

Title

Untitled (little girl leaning over with hands on knees, looking at carpet)

Date

c. 1955

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12588

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Human 98.9
Person 98.9
Apparel 97.5
Clothing 97.5
Shoe 94.7
Footwear 94.7
Person 92.8
Indoors 81.8
Female 76.6
Person 72.2
Room 71.8
Photo 69
Photography 69
People 68.3
Photographer 67.3
Shoe 62.2
Face 60.5
Portrait 60.5
Shorts 59.6
Furniture 59.4
Floor 58.2
Costume 58
Girl 57.5
Play 56.2
Woman 55.6
Shoe 50.9
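
The tag lists above are flat "label score" pairs, where the trailing number is a confidence score on a 0-100 scale. A minimal parsing sketch (the `parse_tags` helper is hypothetical, not part of any museum or vendor API); note that labels may contain spaces (e.g. "musical instrument") and may repeat (e.g. "Shoe"), since each line is a separate detection:

```python
def parse_tags(text: str) -> list[tuple[str, float]]:
    """Parse 'label score' lines into (label, confidence) pairs.

    Splits on the last space so multi-word labels survive;
    duplicates are kept because each line is its own detection.
    """
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = """Human 98.9
Person 98.9
Shoe 94.7
Photo 69"""

print(parse_tags(sample))
# → [('Human', 98.9), ('Person', 98.9), ('Shoe', 94.7), ('Photo', 69.0)]
```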

Imagga
created on 2022-02-04

crutch 47.3
staff 37.5
stick 33.8
man 30.9
sport 28.4
people 23.4
male 22.8
person 21.9
active 18.1
leisure 17.4
playing 17.3
golf 17.2
outdoors 17.2
adult 16.9
child 16.1
ball 15.8
course 15.6
play 15.5
game 15.1
golfer 15
fun 15
exercise 14.5
grass 13.4
senior 13.1
standing 13
competition 12.8
recreation 12.5
businessman 12.4
player 12.3
musical instrument 12.2
men 12
lifestyle 10.8
wind instrument 10.6
flag 10.6
success 10.5
old 10.4
walking 10.4
business 10.3
action 10.2
happy 10
brass 9.9
activity 9.8
putt 9.8
retirement 9.6
women 9.5
club 9.4
day 9.4
outside 9.4
joy 9.2
pretty 9.1
fitness 9
summer 9
world 8.9
trombone 8.9
putting 8.8
retired 8.7
couple 8.7
boy 8.7
happiness 8.6
hole 8.6
portrait 8.4
holding 8.2
professional 8.2
fairway 7.8
work 7.8
golfing 7.8
black 7.8
model 7.8
full length 7.8
run 7.7
elderly 7.7
outdoor 7.6
walk 7.6
fashion 7.5
enjoy 7.5
human 7.5
one 7.5
park 7.4
sports 7.4
vacation 7.4
life 7.3
swing 7.3
chair 7.2
copy space 7.2
suit 7.2
childhood 7.2
handsome 7.1
smile 7.1
family 7.1
worker 7.1
kid 7.1

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 97.4
clothing 88.6
footwear 87.2
person 76.2
dress 74.4
drawing 66.1
woman 65.6
cartoon 53.4
posing 46.4

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Male, 96.8%
Calm 56.4%
Sad 42.3%
Confused 0.6%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

Feature analysis

Amazon

Person 98.9%
Shoe 94.7%

Text analysis

Amazon

8
YT37A2
MJIR YT37A2
MJIR