Human Generated Data

Title

Untitled (men playing baseball)

Date

1947

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21707

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.8
Human 99.8
Person 99.8
Person 99.8
Person 99.7
Footwear 99.1
Clothing 99.1
Apparel 99.1
Shoe 99.1
Plant 95.9
Grass 95.9
Person 93.7
Tree 88.1
Shoe 82.8
People 78.8
Photography 77.8
Photo 77.8
Vegetation 74.6
Person 72.3
Outdoors 69
Portrait 68.8
Face 68.8
Shorts 65.4
Female 61.2
Field 59.3
Sport 59.2
Sports 59.2
Cricket 56.2
Lawn 55.5
Park 55.5
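
Label lists like the one above are typically produced by an image-labeling service such as Amazon Rekognition's DetectLabels, then filtered to a minimum confidence and sorted in descending order. A minimal sketch of that post-processing step, assuming a pre-fetched response dict shaped like the real API's output (the sample labels are taken from this record; the 55% threshold is an assumption chosen to match the lowest confidence shown here):

```python
# Sketch: filter and sort labels from a DetectLabels-style response.
# The response shape follows the Rekognition API (a "Labels" list of
# dicts with "Name" and "Confidence"); the threshold is an assumption.

def top_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs at or above min_confidence,
    sorted by confidence, highest first."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    return sorted(labels, key=lambda pair: pair[1], reverse=True)

# Sample response using a few labels from this record.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.8},
        {"Name": "Grass", "Confidence": 95.9},
        {"Name": "Cricket", "Confidence": 56.2},
        {"Name": "Bench", "Confidence": 40.0},  # below threshold, dropped
    ]
}

for name, conf in top_labels(sample):
    print(f"{name} {conf}")
```

Note that repeated tags (e.g. several "Person" entries) are expected: each detected instance is reported separately with its own confidence.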

Imagga
created on 2022-03-11

runner 41
athlete 36.2
sport 30.6
person 27.4
people 22.9
contestant 21
lifestyle 20.2
adult 19.7
silhouette 18.2
sunset 18
beach 17.8
outdoors 17.7
summer 16.7
outdoor 16.1
mechanical device 16
active 15.5
fitness 15.4
posing 15.1
action 14.8
exercise 14.5
swing 14.5
body 14.4
water 14
attractive 14
man 13.5
leisure 13.3
male 13
fashion 12.8
sensuality 12.7
run 12.5
dark 12.5
ocean 12.5
model 12.4
lady 12.2
mechanism 11.9
health 11.8
sky 11.5
vacation 11.5
healthy 11.3
boy 11.3
sun 11.3
sexy 11.2
hair 11.1
portrait 11
child 10.9
dress 10.8
recreation 10.8
running 10.6
couple 10.5
forest 10.4
walking 10.4
legs 10.4
sports equipment 10.1
travel 9.9
pretty 9.8
human 9.8
sand 9.5
happy 9.4
plaything 9.3
street 9.2
ball 9.2
world 8.7
walk 8.6
outside 8.6
dancer 8.4
sprinkler 8.3
park 8.2
fun 8.2
one 8.2
teenager 8.2
style 8.2
light 8.1
wall 8.1
wet 8.1
women 7.9
sea 7.8
black 7.8
exercising 7.7
free 7.5
sidewalk 7.4
slim 7.4
pose 7.3
road 7.2
grass 7.1
country 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

outdoor 98.1
black and white 93.4
footwear 86
text 82.1
tree 78.3
person 67.2
monochrome 67.1
clothing 61.7
people 58.8
man 54.5
street 52

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 53.2%
Fear 88.5%
Calm 3%
Happy 1.9%
Sad 1.8%
Surprised 1.8%
Angry 1.7%
Disgusted 0.7%
Confused 0.5%
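
Emotion breakdowns like the one above come from face-detection APIs that return a probability for each candidate emotion; the reported dominant emotion is simply the highest-scoring entry. A small sketch, assuming a list of emotion dicts shaped like Rekognition's DetectFaces output (the values are the ones listed above):

```python
def dominant_emotion(emotions):
    """Given a list of {"Type": ..., "Confidence": ...} dicts,
    return the (type, confidence) pair with the highest confidence."""
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Values from the AWS Rekognition face analysis of this photograph.
face_emotions = [
    {"Type": "FEAR", "Confidence": 88.5},
    {"Type": "CALM", "Confidence": 3.0},
    {"Type": "HAPPY", "Confidence": 1.9},
    {"Type": "SAD", "Confidence": 1.8},
    {"Type": "SURPRISED", "Confidence": 1.8},
    {"Type": "ANGRY", "Confidence": 1.7},
    {"Type": "DISGUSTED", "Confidence": 0.7},
    {"Type": "CONFUSED", "Confidence": 0.5},
]

print(dominant_emotion(face_emotions))
```

The low-confidence spread across the remaining emotions (all under 4%) is typical when one label strongly dominates.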

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 99.1%

Captions

Microsoft

a group of people walking down the street 94.1%
a group of people walking down a street 93.9%
a group of people walking on a street 91.6%

Text analysis

Amazon

YT33A2
MJI3 YT33A2 032NA
032NA
MJI3

Google

MJ13 YT33A2 022MA
022MA
YT33A2
MJ13