Human Generated Data

Title

Untitled (three children playing on toy horse)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21690

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 99.6
Person 99.6
Clothing 98.8
Apparel 98.8
Person 98.3
Plant 94.3
Grass 94.3
Outdoors 93.8
Nature 91.8
Person 91.5
Hand 79.5
People 79.5
Shorts 75.7
Dress 71.9
Female 71.6
Yard 69.3
Play 68.1
Kid 67.9
Child 67.9
Face 66.6
Field 66.5
Suit 64.8
Coat 64.8
Overcoat 64.8
Sports 62.7
Sport 62.7
Photography 62.7
Photo 62.7
Girl 60.4
Boy 59.5
Path 59.3
Helmet 58.8
Park 57.6
Lawn 57.6
Pants 55.5
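
The label and confidence pairs above have the shape of an AWS Rekognition DetectLabels response. A minimal sketch of the kind of request that produces such a list, assuming AWS credentials are configured for boto3 and using the placeholder file name photo.jpg:

```python
import boto3

# Sketch of an AWS Rekognition label request; "photo.jpg" stands in for the
# digitized print, and AWS credentials/region are assumed to be configured.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # the listing above bottoms out around 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```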

Imagga
created on 2022-03-11

sport 36.3
man 26.9
outdoors 26.5
active 23.5
person 23.4
people 21.8
male 21.4
outdoor 20.6
adult 19.7
beach 19.4
ball 18.4
competition 18.3
child 17.9
grass 17.4
action 16.7
leisure 16.6
exercise 16.3
fun 15.7
boy 15.6
run 15.4
running 15.3
game 15.2
summer 14.8
player 14.7
planner 14.6
playing 13.7
fitness 13.6
athlete 13.3
lifestyle 13
sports 12.9
recreation 12.6
field 12.6
activity 12.5
vacation 12.3
play 12.1
outside 12
sand 11.9
day 11.8
goal 11.5
travel 11.3
sky 10.8
sunset 10.8
soccer 10.6
football 10.6
men 10.3
clothing 10.2
horse 10.1
team 9.9
practice 9.7
couple 9.6
healthy 9.4
ocean 9.3
body 8.8
happy 8.8
track 8.7
athletic 8.6
relax 8.4
joy 8.4
health 8.3
success 8
mountain 8
equipment 7.9
kick 7.8
motion 7.7
two 7.6
hand 7.6
walking 7.6
legs 7.6
relaxation 7.5
one 7.5
water 7.3
dress 7.2
family 7.1
guy 7.1
sea 7
together 7
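
The Imagga tags above follow the response shape of Imagga's v2 tagging endpoint. A sketch of such a request; the API key, API secret, and image URL below are placeholders, not values from this record:

```python
import requests

# Sketch of an Imagga v2 tagging request; the key, secret, and image URL are
# placeholders. The response follows Imagga's {"result": {"tags": [...]}} shape.
IMAGGA_KEY = "your-api-key"
IMAGGA_SECRET = "your-api-secret"
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical URL of the scan

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```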

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

outdoor 97.2
dance 92.1
text 89.3
clothing 85.6
person 84
black and white 82.4
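
The Microsoft tags above match the output of Azure's Computer Vision tagging endpoint. A sketch assuming a v3.2 resource; the endpoint, subscription key, and image URL are placeholders:

```python
import requests

# Sketch of an Azure Computer Vision tag request; the endpoint, key, and
# image URL are placeholders for real resource values.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your-subscription-key"
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical URL of the scan

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidences come back in the 0-1 range; the listing above shows percentages.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```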

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 67.2%
Happy 46.7%
Calm 38.4%
Sad 10.6%
Angry 2.1%
Fear 0.7%
Disgusted 0.7%
Surprised 0.5%
Confused 0.3%

AWS Rekognition

Age 6-14
Gender Male, 60.6%
Happy 54.8%
Angry 15.1%
Sad 11.4%
Fear 6.9%
Surprised 5.7%
Disgusted 3.3%
Calm 2.7%
Confused 0.2%

AWS Rekognition

Age 12-20
Gender Male, 98.7%
Happy 64.2%
Calm 9.5%
Surprised 9.1%
Fear 7.5%
Sad 4.3%
Angry 2.7%
Disgusted 2.3%
Confused 0.5%
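
Each of the three AWS Rekognition blocks above corresponds to one face in a DetectFaces response requested with all attributes. A sketch, reusing the placeholder file name photo.jpg:

```python
import boto3

# Sketch of an AWS Rekognition face-attribute request; "photo.jpg" is a
# placeholder, and Attributes=["ALL"] asks for age range, gender, and emotions.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```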

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
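
The Google Vision blocks report likelihood buckets rather than percentages, which is how the face_detection method of the google-cloud-vision client returns face attributes. A sketch, again with photo.jpg as a placeholder and application credentials assumed to be configured:

```python
from google.cloud import vision

# Sketch of a Google Cloud Vision face-detection request; "photo.jpg" is a
# placeholder and GOOGLE_APPLICATION_CREDENTIALS is assumed to be set.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a likelihood bucket such as VERY_UNLIKELY, not a score.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```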

Feature analysis

Amazon

Person 99.6%
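
The feature score above corresponds to the per-person Instances that Rekognition attaches to the Person label in the same DetectLabels response. A brief sketch with the same placeholder file name:

```python
import boto3

# Sketch: person bounding boxes come back as Instances on the "Person" label
# in the DetectLabels response; "photo.jpg" is a placeholder.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_labels(Image={"Bytes": image_file.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]
            print(f"Person {instance['Confidence']:.1f}% at "
                  f"left={box['Left']:.2f}, top={box['Top']:.2f}")
```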

Text analysis

Amazon

10A3.
10A3. د. KODAKSRA
د.
KODAKSRA
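
The Amazon text results above (a full detected line followed by its word fragments) have the shape of a Rekognition DetectText response. A sketch with the placeholder file name photo.jpg:

```python
import boto3

# Sketch of an AWS Rekognition text-detection request; "photo.jpg" is a
# placeholder. LINE entries give whole detected lines, WORD entries the pieces.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```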

Google

のや。 VTヨヨA2-MAaOx voY ....
ヨヨ
A2
VT
-
....
MAaOx
voY
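
The Google text results follow the same pattern, which is how the google-cloud-vision text_detection method reports annotations: the first entry is the full detected string and the remaining entries are its fragments. A sketch, with photo.jpg as a placeholder:

```python
from google.cloud import vision

# Sketch of a Google Cloud Vision text-detection request; "photo.jpg" is a
# placeholder. The first annotation is the full detected text block, the
# rest are its individual fragments.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```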