Human Generated Data

Title

Untitled (basketball players playing on court; shooting a basket, Harlem Globe Trotters)

Date

c. 1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14405

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.6
Human 99.6
People 99.5
Person 98.5
Shorts 97.6
Apparel 97.6
Clothing 97.6
Sport 97.4
Team 97.4
Sports 97.4
Team Sport 97.4
Person 96.6
Person 95.6
Shoe 95.6
Footwear 95.6
Person 91.1
Basketball 87.6
Sphere 83.4
Person 79.8
Basketball Court 74.7
Nature 72
Ball 69
Outdoors 64.9
Sand 57
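Each Amazon entry above is a label–confidence pair, with confidence expressed as a percentage. As a minimal sketch of how such tags might be filtered for display, the snippet below keeps only labels above a threshold; the function name, threshold, and the subset of tags shown are illustrative assumptions, not part of the source system.

```python
# Hypothetical sketch: a subset of the (label, confidence) pairs above,
# filtered at an assumed display threshold. Tag values are copied from
# the record; the function and threshold are illustrative only.

amazon_tags = [
    ("Person", 99.6), ("Human", 99.6), ("People", 99.5),
    ("Basketball", 87.6), ("Basketball Court", 74.7),
    ("Nature", 72.0), ("Ball", 69.0), ("Outdoors", 64.9), ("Sand", 57.0),
]

def confident_labels(tags, min_confidence=80.0):
    """Return labels at or above the confidence threshold, highest first."""
    kept = [(label, conf) for label, conf in tags if conf >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(confident_labels(amazon_tags))
```

At an 80% threshold this keeps the person-related labels and "Basketball" while dropping lower-confidence guesses such as "Sand".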

Imagga
created on 2022-01-29

silhouette 36.4
man 26.9
sport 25.9
people 22.9
male 19.2
sax 18.1
active 18
person 17
wheeled vehicle 16.9
sunset 16.2
ski 15.6
unicycle 15.6
recreation 14.3
exercise 13.6
snow 13.6
fun 13.5
black 13.2
men 12.9
motion 12.8
winter 12.8
sky 12.8
activity 12.5
outdoor 12.2
action 12.1
vehicle 12
jump 11.5
sun 11.3
wind instrument 11.2
outdoors 11.2
fitness 10.8
slope 10.3
business 10.3
stick 10.3
drawing 10.2
lifestyle 10.1
hockey stick 10
mountain 9.9
dancer 9.8
skier 9.8
musical instrument 9.7
conveyance 9.7
group 9.7
landscape 9.7
jumping 9.7
extreme 9.6
athlete 9.6
boy 9.6
play 9.5
equipment 9.4
sports 9.2
competition 9.2
city 9.1
performer 9.1
ski slope 9
team 9
guitar 8.8
beach 8.7
athletic 8.6
ball 8.5
travel 8.5
leisure 8.3
speed 8.2
dance 8.1
success 8
businessman 7.9
adult 7.7
run 7.7
outside 7.7
sports equipment 7.7
adventure 7.6
walking 7.6
movement 7.5
sketch 7.5
reflection 7.5
water 7.3
music 7.3
office 7.2
game 7.1
river 7.1
day 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

basketball 89.6
text 87.3
person 84.2
footwear 75.2
group 61.6
man 55.1

Face analysis

AWS Rekognition

Age 11-19
Gender Female, 99.1%
Happy 49.1%
Calm 44.8%
Angry 2%
Sad 1.7%
Disgusted 0.9%
Surprised 0.6%
Fear 0.6%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Male, 97.1%
Calm 64.3%
Happy 17.3%
Fear 10.2%
Disgusted 3%
Angry 1.9%
Confused 1.5%
Surprised 1.1%
Sad 0.7%

AWS Rekognition

Age 20-28
Gender Female, 98.9%
Calm 65.9%
Surprised 14.1%
Sad 4.3%
Angry 4.2%
Fear 4%
Disgusted 3.6%
Confused 2%
Happy 2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 95.6%

Captions

Microsoft

a group of people posing for a photo 92.3%
a group of people posing for a picture 92.2%
a group of people posing for the camera 92.1%

Text analysis

Amazon

15
48
HARLE
ниссо
SKOKE
ниссо ЫГW
ЫГW

Google

MJI7
YT3RA2
MJI7 YT3RA2