Human Generated Data

Title

Untitled (men's basketball team, 1927-1928)

Date

c. 1928, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5742

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.9
Person 99.9
Person 99.6
Person 99.6
Person 99.2
Person 99.2
Person 99.2
Person 98.4
Person 98.4
Person 97.9
Person 91.9
Sports 90.8
Sport 90.8
Apparel 89.3
Clothing 89.3
People 87.8
Shorts 81.2
Shoe 80.2
Footwear 80.2
Chair 77.5
Furniture 77.5
Shoe 76.4
Team 66.3
Team Sport 62.1
Performer 57.2
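
The Amazon labels above are confidence-scored tags of the kind returned by the Rekognition DetectLabels API. A minimal sketch of such a call follows; the file name, region, and thresholds are placeholders, not details taken from this record.

```python
import boto3

# Label the photograph with Amazon Rekognition; each label comes back with
# a name and a confidence score, as in the list above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("team_photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```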

Clarifai
created on 2019-11-16

people 99.7
group together 99.3
adult 96.9
athlete 96.6
many 96.3
portrait 94.3
outfit 93.7
monochrome 92.9
man 92
uniform 90
woman 89.6
competition 89.4
group 87.1
wear 86.9
sports equipment 85.7
action energy 85
sport 84.7
collage 84.2
recreation 80.9
action 80.5
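
The Clarifai concepts above could be reproduced with a predict request against a general image recognition model. The sketch below is an assumption about the REST call: the personal access token, model identifier, image URL, and response layout are placeholders, not taken from this record.

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"  # assumed model id

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/team_photo.jpg"}}}]}
resp = requests.post(URL, json=payload, headers={"Authorization": f"Key {PAT}"})
resp.raise_for_status()

# Each concept carries a name and a value in [0, 1], printed here as a percentage.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```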

Imagga
created on 2019-11-16

football helmet 61.1
helmet 49.5
headdress 45.5
clothing 40.9
person 35.4
black 34.4
body 32
people 31.8
model 31.1
sexy 29.7
adult 27.9
attractive 25.2
athlete 24.6
male 24.1
man 22.8
dark 20.9
covering 19.2
silhouette 19
studio 19
dance 18.9
portrait 18.8
consumer goods 18.2
runner 18.1
fashion 18.1
posing 17.8
back 17.1
hair 16.6
one 16.4
skin 16.1
human 15.7
style 15.6
women 15
dancer 14.9
bathing cap 14.8
art 14.6
erotic 14.2
couple 13.9
lifestyle 13.7
sensuality 13.6
pose 13.6
contestant 12.9
expression 12.8
face 12.8
sensual 12.7
fitness 12.6
pretty 12.6
passion 12.2
lady 12.2
love 11.8
sport 11.8
dress 11.7
naked 11.6
muscular 11.5
cap 11.4
group 11.3
men 11.2
slim 11
nude 10.7
emotion 10.1
fit 10.1
elegance 10.1
exercise 10
dancing 9.6
elegant 9.4
happy 9.4
strong 9.4
cute 9.3
performer 8.9
muscle 8.7
athletic 8.6
party 8.6
youth 8.5
head 8.4
health 8.3
training 8.3
fun 8.2
teenager 8.2
healthy 8.2
night 8
happiness 7.8
artistic 7.8
sitting 7.7
sexual 7.7
two 7.6
hand 7.6
20s 7.3
sunset 7.2
ball 7.1
maillot 7.1
boy 7
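
The Imagga tags above are the sort of output returned by its tagging endpoint. A hedged sketch, assuming HTTP basic auth with an API key and secret and a publicly reachable image URL; all credentials and URLs are placeholders.

```python
import requests

AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/team_photo.jpg"},
    auth=AUTH,
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score, as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```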

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

Face analysis

AWS Rekognition

Age 13-25
Gender Male, 98.1%
Fear 0%
Surprised 0%
Angry 0.3%
Calm 88.4%
Sad 10.7%
Happy 0.1%
Disgusted 0%
Confused 0.4%

AWS Rekognition

Age 18-30
Gender Male, 78.7%
Surprised 0.2%
Sad 0.6%
Confused 0.1%
Happy 0.3%
Disgusted 0.1%
Fear 0%
Angry 0.4%
Calm 98.1%

AWS Rekognition

Age 16-28
Gender Male, 54.5%
Fear 45%
Sad 45.1%
Surprised 45%
Angry 45.1%
Disgusted 45%
Confused 45%
Happy 45%
Calm 54.7%

AWS Rekognition

Age 13-25
Gender Male, 54.5%
Angry 45%
Calm 55%
Happy 45%
Sad 45%
Fear 45%
Surprised 45%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 18-30
Gender Female, 52.7%
Disgusted 45%
Calm 54.9%
Angry 45%
Confused 45%
Fear 45%
Sad 45%
Surprised 45%
Happy 45%

AWS Rekognition

Age 13-23
Gender Male, 53%
Angry 45.1%
Calm 51.5%
Sad 48.3%
Happy 45%
Fear 45.1%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Male, 54.9%
Disgusted 45%
Happy 45%
Surprised 45%
Sad 45.1%
Fear 45%
Calm 54.9%
Angry 45%
Confused 45%

AWS Rekognition

Age 27-43
Gender Male, 54.4%
Surprised 45%
Confused 45%
Sad 45.6%
Disgusted 45%
Angry 45.1%
Calm 51.2%
Fear 45%
Happy 48%

AWS Rekognition

Age 19-31
Gender Male, 54.6%
Angry 45.1%
Sad 45.4%
Confused 45%
Calm 54.5%
Surprised 45%
Happy 45%
Disgusted 45%
Fear 45%

AWS Rekognition

Age 25-39
Gender Male, 54.9%
Calm 52.7%
Disgusted 45%
Happy 47%
Fear 45%
Confused 45%
Angry 45.1%
Surprised 45%
Sad 45.1%
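
The per-face age ranges, gender guesses, and emotion scores above correspond to the fields returned by the Rekognition DetectFaces API when all facial attributes are requested. A minimal sketch; the file name and region are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("team_photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```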

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 21
Gender Male

Microsoft Cognitive Services

Age 25
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male
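
The single age and gender values above match the attributes historically returned by the Azure Face API detect endpoint. Access to those attributes is now restricted, so the following is only an illustrative sketch; the endpoint, key, and image URL are placeholders.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_FACE_API_KEY"  # placeholder key

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/team_photo.jpg"},
)
resp.raise_for_status()

# One entry per detected face, each with an estimated age and gender.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```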

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
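
The "Very unlikely" ratings above are the likelihood buckets reported per face by Cloud Vision face detection. A minimal sketch using the google-cloud-vision client; the file name is a placeholder and credentials are assumed to be configured in the environment.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("team_photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation reports bucketed likelihoods rather than raw scores.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```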

Feature analysis

Amazon

Person
Shoe
Person 99.9%
Person 99.6%
Person 99.6%
Person 99.2%
Person 99.2%
Person 99.2%
Person 98.4%
Person 98.4%
Person 97.9%
Person 91.9%
Shoe 80.2%
Shoe 76.4%

Captions

Google Gemini

Created by gemini-2.0-flash-lite on 2025-05-11

Here is a description of the image:

The image is a vintage black and white photograph of a basketball team. The team is made up of men dressed in athletic uniforms. The team is divided into two rows: a seated row in the front and a standing row in the back. Each player is wearing a sleeveless tank top and athletic shorts. The tank tops are adorned with a large letter "D". The front row is seated with their legs crossed. They are wearing knee pads and athletic shoes. There is a basketball in front of the seated players, with the words "CD 27-28" printed on it. The standing row is dressed identically in tank tops and shorts. Standing behind the team is a man wearing a suit and tie. The background is a dark, indistinct space that might depict a studio or gymnasium setting. The overall style of the photograph suggests it was taken in the early 20th century.

Created by gemini-2.0-flash on 2025-05-11

Here is a description of the image:

The image is a vintage black and white photograph of a basketball team from what appears to be the early 20th century. The team is composed of nine young men. Five players are seated in the front row and four are standing in the back. Each player is wearing a sleeveless tank top with a large letter, either "L" or "D", on the front, along with shorts. The seated players also have pads around their knees. They all have their arms crossed. Behind the team, there is a painted backdrop of some kind of scenery. In the center of the back row stands a man in a dark suit and tie. In front of the seated row is a basketball with the letters "CD" and the numbers "27-28" written on it. The photograph has a classic, formal team portrait aesthetic.
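
Captions like the two above can be generated by sending the photograph and a short prompt to a Gemini model. The sketch below uses the google-genai client; the prompt, file name, and API-key handling are assumptions, not the pipeline actually used for this record.

```python
from google import genai
from google.genai import types

client = genai.Client()  # reads the API key from the GEMINI_API_KEY environment variable

with open("team_photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_part = types.Part.from_bytes(data=f.read(), mime_type="image/jpeg")

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents=[image_part, "Describe this photograph."],
)
print(response.text)
```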

Text analysis

Amazon

D
D D
I
CD

Google

CD 37-2
CD
37-2
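
The short strings above ("D", "CD", "CD 37-2") are OCR readings of the lettering on the jerseys and the ball. A minimal sketch of the Rekognition DetectText call that produces such readings; the file name and region are placeholders, and Cloud Vision's text_detection method works analogously.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("team_photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Report detected lines of text with their confidence scores.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```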