Human Generated Data

Title

Untitled (girls tennis team)

Date

1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1903

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.8
Human 98.8
Person 98.6
Person 98.1
Person 97.9
Person 97.6
Person 97.5
Person 97.3
Clothing 97
Apparel 97
Person 93.6
Plant 92.8
Vegetation 92.8
Person 90.1
Grass 90.1
Meal 88.9
Food 88.9
Tree 87.2
Dress 85.4
Outdoors 85.3
Female 84.6
People 83
Shorts 82.7
Face 80.6
Poster 79
Advertisement 79
Land 78.6
Nature 78.6
Leisure Activities 76.9
Yard 76.7
Vacation 76.5
Person 75.8
Person 73.6
Person 71.8
Person 70.5
Girl 70.4
Woodland 69.3
Forest 69.3
Photography 65.6
Photo 65.6
Picnic 65
Portrait 62.9
Kid 61.8
Child 61.8
Woman 60.4
Person 57
Field 55.7
Person 55.6
Person 44.4

Clarifai
created on 2023-10-25

people 99.9
child 98.9
group 98
boy 96.5
adult 96.2
many 95.9
man 95
wear 93.8
group together 93.7
education 93.3
uniform 92.5
school 88.8
outfit 88.6
woman 88.4
veil 85.8
retro 84.3
nostalgic 82.8
portrait 82.6
snapshot 82.5
nostalgia 81.7

Imagga
created on 2021-12-14

swing 100
mechanical device 84.6
plaything 83.9
mechanism 62.9
child 19.3
people 17.3
person 17
man 15.4
outdoor 14.5
fun 14.2
play 13.8
dark 13.4
black 13.2
male 12.8
happy 12.5
park 12.3
sibling 11.6
portrait 11
sport 10.9
resort area 10.8
childhood 10.7
dirty 9.9
silhouette 9.9
adult 9.8
summer 9.6
sexy 9.6
body 9.6
boy 9.6
enjoy 9.4
happiness 9.4
playing 9.1
fashion 9
sunset 9
recreation 9
outdoors 9
world 8.6
day 8.6
grunge 8.5
beach 8.4
area 8.3
vintage 8.3
one 8.2
danger 8.2
vacation 8.2
active 8.1
water 8
kid 8
grass 7.9
art 7.8
old 7.7
sky 7.6
power 7.6
joy 7.5
leisure 7.5
style 7.4
light 7.3
sun 7.2
dress 7.2
hair 7.1
family 7.1
love 7.1
little 7.1

Google
created on 2021-12-14

Adaptation 79.3
Font 79.3
Tints and shades 76.3
Monochrome photography 72.4
Monochrome 72.2
Tree 70.4
Event 68
History 66.5
Visual arts 66.2
Vintage clothing 65.4
Room 62.9
Crew 62.6
Suit 61.8
Art 60.6
Grass 60.1
Photographic paper 55.3
Team 54.9
Sitting 51.2
Photo caption 50.7

Microsoft
created on 2021-12-14

text 98
person 93.6
clothing 83.4
people 74.8
posing 62.7
group 61.6
crowd 1.2

Face Analysis

AWS Rekognition

Age 23-37
Gender Male, 78.3%
Calm 62.8%
Happy 29.4%
Sad 2.6%
Confused 2%
Fear 1.4%
Surprised 0.9%
Disgusted 0.8%
Angry 0.2%

AWS Rekognition

Age 26-42
Gender Female, 52.7%
Angry 68.9%
Disgusted 11.1%
Fear 6.4%
Happy 5%
Calm 2.9%
Surprised 2.3%
Sad 1.7%
Confused 1.7%

AWS Rekognition

Age 23-35
Gender Female, 61.2%
Calm 88.8%
Sad 8.9%
Happy 0.9%
Confused 0.5%
Surprised 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 1-7
Gender Female, 96%
Calm 79.9%
Sad 10.1%
Happy 9.2%
Confused 0.4%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 28-44
Gender Male, 91.9%
Calm 79.2%
Sad 11.8%
Happy 4.5%
Confused 2.6%
Angry 0.8%
Surprised 0.5%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition

Age 38-56
Gender Female, 87.4%
Sad 75.1%
Happy 18.2%
Calm 4.4%
Confused 0.9%
Disgusted 0.6%
Angry 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 40-58
Gender Female, 72.6%
Happy 70.9%
Calm 13%
Angry 6.9%
Sad 5.8%
Confused 2.1%
Disgusted 0.7%
Surprised 0.5%
Fear 0.2%

AWS Rekognition

Age 29-45
Gender Male, 59.5%
Happy 34.3%
Surprised 18.5%
Sad 14.8%
Calm 12.3%
Fear 8.2%
Confused 6.6%
Angry 3.4%
Disgusted 1.9%

AWS Rekognition

Age 41-59
Gender Female, 91.1%
Happy 69.6%
Calm 26.9%
Confused 1.9%
Sad 0.6%
Surprised 0.4%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 35-51
Gender Male, 92.3%
Calm 50.6%
Sad 22.1%
Happy 11.3%
Angry 5.2%
Confused 4.1%
Surprised 3.9%
Fear 1.5%
Disgusted 1.2%

AWS Rekognition

Age 36-52
Gender Female, 68.9%
Happy 93.4%
Calm 4.2%
Surprised 1.4%
Confused 0.4%
Sad 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 50-68
Gender Female, 81%
Calm 49.5%
Happy 27.5%
Sad 14.4%
Confused 4.5%
Angry 2.8%
Disgusted 0.6%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 36-52
Gender Female, 87.3%
Happy 68.1%
Calm 20.1%
Sad 4.3%
Surprised 3.1%
Angry 1.6%
Confused 1.3%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 29-45
Gender Female, 80.2%
Calm 60.6%
Sad 32.5%
Happy 2.8%
Confused 1.8%
Angry 1.4%
Fear 0.4%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 47-65
Gender Female, 92.7%
Happy 62%
Calm 20.9%
Sad 15.6%
Confused 0.5%
Angry 0.4%
Surprised 0.3%
Disgusted 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 98.8%
Poster 79%

Categories

Imagga

paintings art 98.3%