Human Generated Data

Title

Untitled (studio portrait of Hodgdon, ME female basketball team standing in profile)

Date

1940

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10985

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.9
Human 99.9
Person 99.7
Person 99.7
Person 99.6
Person 99.5
Person 99.4
Person 99
Person 99
Person 98.6
Apparel 97.5
Footwear 97.5
Clothing 97.5
Shoe 97.5
Shoe 96.6
Shoe 95.7
Shoe 95.2
Shoe 94.7
Shoe 94.1
Shoe 89.2
People 78.6
Stage 70.8
Leisure Activities 69.9
Face 65.2
Meal 63.9
Food 63.9
Helmet 63.2
Shorts 62.2
Female 61.6
Girl 59.6
Dance Pose 58.4
Dish 57.7
Coat 56.2
Sleeve 55.9
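Tag lists like the Amazon one above are usually consumed by thresholding on the confidence score rather than taken whole. A minimal sketch of that filtering step, in plain Python with no AWS dependency — the tag/score pairs are transcribed from the list above, and the 90% threshold is an illustrative choice, not anything specified by the source:

```python
# Tag/score pairs copied from the Amazon-generated list above
# (scores are confidence percentages as printed in the record).
tags = [
    ("Person", 99.9), ("Human", 99.9), ("Person", 99.7), ("Person", 99.7),
    ("Person", 99.6), ("Person", 99.5), ("Person", 99.4), ("Person", 99.0),
    ("Person", 99.0), ("Person", 98.6), ("Apparel", 97.5), ("Footwear", 97.5),
    ("Clothing", 97.5), ("Shoe", 97.5), ("Shoe", 96.6), ("Shoe", 95.7),
    ("Shoe", 95.2), ("Shoe", 94.7), ("Shoe", 94.1), ("Shoe", 89.2),
    ("People", 78.6), ("Stage", 70.8), ("Leisure Activities", 69.9),
    ("Face", 65.2), ("Meal", 63.9), ("Food", 63.9), ("Helmet", 63.2),
    ("Shorts", 62.2), ("Female", 61.6), ("Girl", 59.6), ("Dance Pose", 58.4),
    ("Dish", 57.7), ("Coat", 56.2), ("Sleeve", 55.9),
]

def confident_labels(pairs, threshold=90.0):
    """Return distinct label names whose confidence meets the threshold,
    preserving the order of first appearance."""
    seen, out = set(), []
    for name, score in pairs:
        if score >= threshold and name not in seen:
            seen.add(name)
            out.append(name)
    return out

print(confident_labels(tags))
# → ['Person', 'Human', 'Apparel', 'Footwear', 'Clothing', 'Shoe']
```

At a 90% cutoff only the person/clothing labels survive; the lower-scoring tags (Meal, Helmet, Dance Pose) illustrate why a threshold is typically applied before these labels are displayed or indexed.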

Imagga
created on 2022-02-05

people 34
person 31.7
musical instrument 28.3
man 24.8
dancer 24.5
silhouette 24
male 23.4
group 23.4
men 21.5
performer 20.2
wind instrument 20
world 16.9
businessman 16.8
crowd 16.3
women 15
adult 14.9
couple 14.8
business 14.6
entertainer 14.6
dress 14.4
brass 13.8
teamwork 13
marimba 12.6
sunset 12.6
happiness 12.5
player 12
happy 11.3
percussion instrument 11.1
beach 10.9
kin 10.9
lifestyle 10.8
team 10.7
accordion 10.7
uniform 10.7
professional 10.2
fashion 9.8
human 9.7
success 9.6
summer 9.6
together 9.6
sexy 9.6
black 9.6
bride 9.6
businesswoman 9.1
family 8.9
celebration 8.8
stage 8.7
contestant 8.7
love 8.7
work 8.6
keyboard instrument 8.6
walking 8.5
youth 8.5
casual 8.5
two 8.5
design 8.4
friendship 8.4
pose 8.1
suit 8.1
athlete 8
ballplayer 8
water 8
dance 8
job 8
smiling 7.9
holiday 7.9
boy 7.8
sea 7.8
model 7.8
travel 7.7
party 7.7
attractive 7.7
ocean 7.5
girls 7.3
sun 7.2
platform 7.1
sky 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

posing 98.5
person 96.3
standing 94.3
clothing 90.6
text 90.4
group 90
smile 68
footwear 64.9

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Male, 54%
Calm 99.8%
Sad 0.1%
Happy 0%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 55.5%
Happy 99%
Calm 0.6%
Surprised 0.1%
Fear 0.1%
Sad 0%
Disgusted 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 34-42
Gender Female, 84.9%
Calm 100%
Sad 0%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 99.5%
Happy 72.1%
Calm 15.2%
Surprised 11.4%
Sad 0.3%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 20-28
Gender Male, 89.5%
Happy 88%
Surprised 8.3%
Calm 2.3%
Angry 0.5%
Disgusted 0.4%
Confused 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 96.5%
Calm 99.8%
Sad 0.1%
Happy 0%
Confused 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Female, 57.5%
Happy 86.5%
Calm 11%
Sad 1.2%
Surprised 0.5%
Confused 0.3%
Fear 0.2%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 26-36
Gender Female, 65.3%
Happy 67.1%
Calm 27.1%
Sad 4.6%
Confused 0.6%
Surprised 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Male, 87.2%
Calm 100%
Happy 0%
Sad 0%
Disgusted 0%
Fear 0%
Confused 0%
Surprised 0%
Angry 0%
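The nine AWS Rekognition face records above can be summarized mechanically, e.g. by tallying predicted genders and pooling the age ranges. A minimal sketch with the values transcribed from the records above (the tuple layout is illustrative and is not the Rekognition API response shape):

```python
# Hypothetical summary of the nine AWS Rekognition face records above.
# Each tuple: (age_low, age_high, predicted_gender, gender_confidence_pct).
faces = [
    (33, 41, "Male", 54.0),
    (29, 39, "Male", 55.5),
    (34, 42, "Female", 84.9),
    (26, 36, "Male", 99.5),
    (20, 28, "Male", 89.5),
    (23, 33, "Male", 96.5),
    (27, 37, "Female", 57.5),
    (26, 36, "Female", 65.3),
    (28, 38, "Male", 87.2),
]

def summarize(records):
    """Count gender predictions and pool the age ranges."""
    genders = {}
    for low, high, gender, conf in records:
        genders[gender] = genders.get(gender, 0) + 1
    pooled_age = (min(r[0] for r in records), max(r[1] for r in records))
    return genders, pooled_age

print(summarize(faces))
# → ({'Male': 6, 'Female': 3}, (20, 42))
```

Note that six of the nine faces are predicted Male even though the title identifies a female team, and two of those predictions sit barely above 50% confidence — a reminder that these per-face scores should be read cautiously rather than taken as ground truth.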

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Shoe 97.5%
Helmet 63.2%

Captions

Microsoft

a group of people posing for a photo 97%
a group of people posing for a picture 96.9%
a group of people posing for the camera 96.8%

Text analysis

Amazon

9-40
193 9-40
MJ13 YT33A2
MJ13 YT33A2 АЗДА
АЗДА
193

Google

39-40
39-40