Human Generated Data

Title

Untitled (men on basketball team, group portrait)

Date

1948

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19350

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Person 99.3
Person 99.1
Person 99.1
Person 99.1
Person 98.9
Person 98.8
Sailor Suit 97.8
Clothing 65
Apparel 65
Window 61.7
Prison 60.8
Crowd 58.8

Clarifai
created on 2023-10-22

people 99.6
group together 99.1
many 98.3
adult 96.5
wear 96.2
uniform 96.2
man 95
outfit 94.2
position 92.7
group 89.8
competition 88.2
athlete 86.9
education 86.3
woman 83.8
portrait 81.6
school 78.8
sports equipment 76.5
several 75.7
discipline 75.7
boxer 73.3

Imagga
created on 2022-03-05

crowd 24
group 23.4
center 20.6
people 17.8
team 17
marimba 17
maze 14.6
landscape 14.1
silhouette 14.1
percussion instrument 13.9
person 13.7
business 12.7
man 12.1
men 12
musical instrument 11.6
male 11.3
water 11.3
scene 11.2
baron 10
audience 9.7
businessman 9.7
women 9.5
teamwork 9.3
building 9.1
art 9
outdoors 9
rural 8.8
ice 8.6
sea 8.6
travel 8.4
animal 8.3
sky 8.3
dancer 8.2
snow 8.2
adult 8
rock 7.8
black 7.8
cold 7.7
flag 7.7
motion 7.7
winter 7.7
fun 7.5
icon 7.1
work 7.1

Google
created on 2022-03-05

Sleeve 87.2
Standing 86.4
Gesture 85.3
Font 81.8
Player 73.6
Uniform 68.7
Crew 68.5
Sports uniform 65.8
Monochrome photography 65.5
Team sport 63.7
Monochrome 63.6
Room 62.8
Sports jersey 62.1
Team 59
Sports 55.9
Rectangle 51.5

Microsoft
created on 2022-03-05

text 97.7
outdoor 96.7
person 96.5
baseball 91.4
man 90.7
standing 77.7
white 76.9
player 74.8
clothing 73.4
sports uniform 73.1
black 72
posing 59
team 29.5

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.3%
Calm 98.2%
Sad 1.1%
Confused 0.4%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 94.1%
Sad 4.4%
Confused 1%
Disgusted 0.2%
Happy 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 98.6%
Calm 93.6%
Surprised 2.2%
Happy 1.9%
Angry 0.9%
Sad 0.7%
Confused 0.5%
Disgusted 0.2%
Fear 0%

AWS Rekognition

Age 40-48
Gender Female, 59.4%
Calm 98%
Sad 1.8%
Confused 0.1%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Sad 73.3%
Calm 20.8%
Happy 2.2%
Angry 1.9%
Confused 0.8%
Disgusted 0.6%
Surprised 0.3%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 78.5%
Calm 61.3%
Sad 22.7%
Happy 8.2%
Confused 4.5%
Surprised 1.3%
Disgusted 0.9%
Angry 0.8%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Male, 98.3%
Calm 68.8%
Sad 19%
Happy 6.6%
Confused 2%
Surprised 1.3%
Disgusted 1.1%
Angry 0.9%
Fear 0.2%

AWS Rekognition

Age 45-53
Gender Male, 98.9%
Calm 99.5%
Happy 0.2%
Confused 0.1%
Sad 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 85.6%
Happy 90.1%
Sad 5.4%
Calm 2.1%
Confused 1.5%
Disgusted 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 85.5%
Calm 94.1%
Surprised 2.8%
Sad 1.2%
Angry 0.7%
Confused 0.6%
Fear 0.3%
Happy 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.6%
Person 99.6%
Person 99.5%
Person 99.5%
Person 99.4%
Person 99.3%
Person 99.1%
Person 99.1%
Person 99.1%
Person 98.9%
Person 98.8%

Text analysis

Amazon

EXIT
H
Smoking
E
HUP
H DED
H RE
No Smoking
HORA
DED
RE
HUR!!
KODAKA
No
HURRY
HARP
HUBE

Google

EXIT EXIT
EXIT