Human Generated Data

Title

Untitled (Wethersfield High School boys basketball team)

Date

1948

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19364

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Human 99.8
Person 99.8
Person 99.8
Person 99.6
Person 99.2
Shoe 98.9
Footwear 98.9
Person 98.6
Person 98.6
Person 98.5
Person 98.4
Person 98.4
Shoe 98.2
Shoe 98
Shoe 96
Shorts 95.2
Shoe 87.4
Shoe 86.6
Shoe 84.2
People 77.2
Female 76.6
Floor 73
Girl 64.4
Shoe 64
Face 61.1
Boot 61
Shoe 59.7
Costume 59
Flooring 58
Curtain 57
Photography 55.2
Portrait 55.2
Photo 55.2

Imagga
created on 2022-03-05

people 34
group 29.8
men 27.5
crowd 25.9
person 24.7
man 23.5
silhouette 23.2
male 21.3
classroom 19.4
room 19.3
women 19
adult 18.2
business 17.6
walking 16.1
sport 15.8
world 15.1
businessman 15
gymnasium 14.9
outdoors 14.9
spectator 13.8
lifestyle 13
team 12.5
athletic facility 12.1
black 12.1
travel 12
happy 11.9
city 11.6
run 11.6
fun 11.2
life 11.2
active 11
athlete 10.6
together 10.5
couple 10.4
friends 10.3
summer 10.3
work 10.2
speed 10.1
competition 10.1
exercise 10
ball 9.9
lady 9.7
silhouettes 9.7
success 9.7
body 9.6
boy 9.6
walk 9.5
legs 9.4
motion 9.4
teamwork 9.3
action 9.3
dark 9.2
outdoor 9.2
portrait 9.1
facility 9
human 9
sunset 9
graphic 8.7
urban 8.7
running 8.6
building 8.6
outline 8.5
females 8.5
old 8.4
dancer 8.3
fashion 8.3
girls 8.2
runner 8.2
office 8.2
copy space 8.1
recreation 8.1
smiling 8
dance 7.8
art 7.7
sky 7.6
casual 7.6
beach 7.6
child 7.3
back 7.3
performer 7.3
sun 7.2
game equipment 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 96.3
ground 96.2
outdoor 95.7
footwear 94.6
clothing 92.4
group 84.1
man 80
basketball 60.5
woman 53.6
posing 42.6
line 31.9

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 58%
Calm 76.9%
Surprised 6.6%
Sad 5%
Happy 4.9%
Confused 3.4%
Fear 1.4%
Disgusted 1.4%
Angry 0.5%

AWS Rekognition

Age 31-41
Gender Male, 97%
Calm 84.8%
Happy 10.8%
Surprised 1.6%
Sad 1.1%
Angry 0.9%
Disgusted 0.4%
Confused 0.3%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.2%
Sad 50.5%
Calm 24.5%
Happy 9.3%
Surprised 7.7%
Confused 3.5%
Disgusted 2.9%
Angry 1.3%
Fear 0.4%

AWS Rekognition

Age 34-42
Gender Male, 99.1%
Calm 91.7%
Sad 5.2%
Happy 1.1%
Confused 0.7%
Angry 0.4%
Disgusted 0.4%
Surprised 0.4%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Male, 98.3%
Sad 88.3%
Happy 4.6%
Confused 3%
Calm 1.7%
Disgusted 0.7%
Fear 0.7%
Surprised 0.6%
Angry 0.4%

AWS Rekognition

Age 31-41
Gender Male, 88.2%
Calm 94%
Happy 4.1%
Sad 0.7%
Confused 0.3%
Surprised 0.3%
Disgusted 0.3%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 33-41
Gender Male, 98.5%
Calm 44.6%
Sad 22.7%
Surprised 18.1%
Happy 6.5%
Confused 3.4%
Disgusted 2.7%
Angry 1.6%
Fear 0.5%

AWS Rekognition

Age 22-30
Gender Male, 99.5%
Calm 73.3%
Surprised 15.3%
Confused 3.2%
Sad 2.6%
Angry 2.5%
Disgusted 2.1%
Fear 0.5%
Happy 0.4%

AWS Rekognition

Age 36-44
Gender Male, 95.4%
Calm 98.5%
Sad 0.6%
Surprised 0.4%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 98.9%

Captions

Microsoft

a group of people standing in front of a crowd posing for the camera 92.5%
a group of people posing for the camera 92.4%
a group of people posing for a picture 92.3%

Text analysis

Amazon

W
1948-49
3342-1000

Google

179847