Human Generated Data

Title

Untitled (teacher and children standing in classroom)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17006

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.6
Person 99.5
Person 99.2
Person 98.4
Person 97.9
Person 97.7
Clothing 97.3
Apparel 97.3
Shorts 95.4
Shoe 89.7
Footwear 89.7
Shoe 85.8
Play 82.8
Female 72.3
Kid 72.2
Child 72.2
Indoors 69.5
People 68.7
Room 66.7
Girl 60.8
Portrait 60.4
Photography 60.4
Face 60.4
Photo 60.4
Transportation 58.6
Pants 56.2
Vehicle 55.7
Skirt 55.6
Shoe 55.4
Person 48
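The machine tags above are label/confidence pairs, with confidence reported as a 0–100 percentage. A minimal sketch of how a consumer might threshold out low-confidence labels (the sample data is copied from the Amazon list above; the threshold value is an arbitrary illustration, not part of the record):

```python
# Machine tags arrive as (label, confidence) pairs; confidence is a
# 0-100 percentage. A common consumer-side step is dropping labels
# below a chosen threshold. Sample data copied from the list above.
tags = [
    ("Person", 99.7), ("Clothing", 97.3), ("Shorts", 95.4),
    ("Shoe", 89.7), ("Play", 82.8), ("Kid", 72.2),
    ("Indoors", 69.5), ("Girl", 60.8), ("Person", 48.0),
]

def confident_tags(tags, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))  # drops Kid, Indoors, Girl, and the 48% Person
```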

Clarifai
created on 2023-10-28

people 99.8
child 99.7
group 99
education 98.1
group together 97.9
school 97.5
boy 97.4
elementary school 96.1
family 95.2
teacher 94.3
class 93.8
many 92.5
adult 92.5
man 89
several 88.6
woman 87.2
girl 85.9
monochrome 84.2
classroom 83.9
room 80.2

Imagga
created on 2022-02-26

sport 40
person 33.6
man 32.9
ball 31.6
male 28.4
people 26.2
athlete 26.1
dancer 25.6
player 23.5
adult 22.1
teacher 21.3
active 21.3
competition 20.1
performer 19.1
silhouette 19
soccer ball 18.9
exercise 18.1
fitness 17.2
couple 16.5
run 16.4
lifestyle 15.9
boy 15.6
game equipment 15.3
running 14.4
body 14.4
educator 14.3
professional 14.2
outdoors 14.2
golfer 14.1
training 13.9
men 13.7
equipment 13.6
sunset 13.5
fun 13.5
black 13.2
entertainer 13
group 12.9
grass 12.6
action 12.1
motion 12
dark 11.7
child 11.7
recreation 11.6
contestant 11.6
athletic 11.5
runner 11.1
sports equipment 11
love 10.3
sky 10.2
field 10
teenager 10
outdoor 9.9
attractive 9.8
together 9.6
women 9.5
happy 9.4
youth 9.4
sports 9.2
fit 9.2
leisure 9.1
world 9
health 9
team 9
game 8.9
kid 8.9
track 8.6
happiness 8.6
play 8.6
summer 8.4
human 8.2
healthy 8.2
vacation 8.2
dance 7.9
stadium 7.9
win 7.7
friendship 7.5
teen 7.4
water 7.3
girls 7.3
activity 7.2
portrait 7.1
to 7.1
businessman 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

footwear 92.3
person 90.1
clothing 86.9
sport 83.3
text 58
dance 57.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 53-61
Gender Female, 75.9%
Happy 75.2%
Sad 22.2%
Fear 0.7%
Angry 0.5%
Surprised 0.4%
Confused 0.4%
Calm 0.4%
Disgusted 0.3%

AWS Rekognition

Age 37-45
Gender Male, 95.9%
Calm 61.9%
Sad 19.3%
Happy 9.3%
Confused 3.2%
Disgusted 2.6%
Angry 1.6%
Fear 1.2%
Surprised 1%

AWS Rekognition

Age 18-26
Gender Female, 68.4%
Calm 69.3%
Sad 15.4%
Confused 5.1%
Fear 4.7%
Surprised 2.4%
Happy 1.8%
Angry 0.8%
Disgusted 0.6%

AWS Rekognition

Age 6-14
Gender Female, 99.7%
Sad 54.8%
Happy 36.3%
Calm 4.9%
Confused 1.1%
Fear 0.9%
Angry 0.8%
Disgusted 0.7%
Surprised 0.5%

AWS Rekognition

Age 21-29
Gender Female, 56.4%
Calm 45.8%
Sad 23.3%
Happy 16.9%
Surprised 6.8%
Confused 5%
Disgusted 1%
Angry 0.7%
Fear 0.5%
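Each AWS Rekognition face record above pairs eight emotions with percentage scores that sum to roughly 100. The record's headline emotion is simply the highest-scoring entry; a minimal sketch (sample scores copied from the last face block above):

```python
# Each Rekognition face record pairs emotions with percentage scores.
# The dominant emotion is the highest-scoring entry.
# Sample scores copied from the fifth face block above.
face = {
    "Calm": 45.8, "Sad": 23.3, "Happy": 16.9, "Surprised": 6.8,
    "Confused": 5.0, "Disgusted": 1.0, "Angry": 0.7, "Fear": 0.5,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # ('Calm', 45.8)
```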

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.7%
Person 99.7%
Person 99.7%
Person 99.7%
Person 99.6%
Person 99.5%
Person 99.2%
Person 98.4%
Person 97.9%
Person 97.7%
Person 48%
Shoe 89.7%
Shoe 85.8%
Shoe 55.4%

Captions

Text analysis

Amazon

3
KODAKEIW