Human Generated Data

Title

Untitled (woman playing piano for children)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17004

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.3
Human 99.3
Chair 99.2
Furniture 99.2
Person 96.5
Person 95.2
Person 94.9
Face 93.1
Person 88.3
Table 85.3
Person 83.5
Clothing 82.6
Apparel 82.6
Person 82.4
Indoors 80.9
Pc 79.6
Electronics 79.6
Computer 79.6
Person 78.9
Monitor 75.7
Display 75.7
Screen 75.7
Kid 74.5
Child 74.5
Sitting 73.6
Female 72.3
Room 71.9
Desk 71
Girl 66.1
Portrait 66
Photography 66
Photo 66
LCD Screen 64.6
Play 59
Dining Table 57.4
Collage 56.4
Advertisement 56.4
Poster 56.4
Suit 55.9
Coat 55.9
Overcoat 55.9
Shorts 55.1
Person 52.3

Clarifai
created on 2023-10-29

people 99.8
group 99.5
education 98.5
school 97.7
classroom 97
child 96.8
adult 96.4
elementary school 95.8
group together 95.6
man 94.9
teacher 94.1
many 93.9
woman 93.4
boy 92.3
sitting 92
sit 86.9
room 85.7
adolescent 83.7
furniture 82
indoors 80

Imagga
created on 2022-02-26

teacher 45.2
person 34.9
classroom 32.6
man 30.9
male 29.8
people 26.8
educator 26
adult 24.7
professional 24.2
businessman 23.8
room 23.3
silhouette 20.7
business 20
blackboard 19.1
group 17.7
men 15.5
education 14.7
school 14.3
student 13.9
spectator 13.7
team 13.4
black 12.6
happy 12.5
chart 12.4
board 11.9
work 11.8
job 11.5
design 11.2
class 10.6
boy 10.4
teamwork 10.2
symbol 10.1
dark 10
hand 9.9
portrait 9.7
success 9.7
crowd 9.6
diagram 9.6
building 9.4
meeting 9.4
office 9.2
stage 8.9
teaching 8.8
vibrant 8.7
couple 8.7
gesture 8.6
casual 8.5
two 8.5
modern 8.4
flag 8.4
manager 8.4
lights 8.3
performer 8.3
player 8
water 8
hall 7.9
love 7.9
cheering 7.8
nighttime 7.8
audience 7.8
stadium 7.8
child 7.7
patriotic 7.7
performance 7.7
boss 7.6
finance 7.6
nation 7.6
human 7.5
indoor 7.3
teenager 7.3
businesswoman 7.3
looking 7.2
newspaper 7.2
musician 7.2
chair 7.2
bright 7.1
financial 7.1
icon 7.1
women 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.6
drawing 82.4
person 78.9
cartoon 62.9
clothing 51.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Calm 69.6%
Sad 12%
Happy 10.8%
Angry 3.6%
Surprised 1.9%
Disgusted 0.7%
Confused 0.6%
Fear 0.6%

AWS Rekognition

Age 27-37
Gender Male, 100%
Calm 69.5%
Sad 17.4%
Angry 9.9%
Disgusted 0.9%
Surprised 0.8%
Confused 0.5%
Fear 0.5%
Happy 0.4%

AWS Rekognition

Age 33-41
Gender Male, 98.8%
Calm 73.1%
Sad 11.3%
Happy 7.9%
Angry 4.8%
Surprised 1.2%
Fear 0.6%
Confused 0.6%
Disgusted 0.5%

AWS Rekognition

Age 31-41
Gender Male, 94.4%
Calm 99.8%
Sad 0.1%
Happy 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 13-21
Gender Male, 98.9%
Calm 47.3%
Angry 43.4%
Surprised 4.3%
Sad 2.8%
Disgusted 0.9%
Confused 0.6%
Fear 0.4%
Happy 0.3%

Feature analysis

Amazon

Person
Person 99.3%
Person 96.5%
Person 95.2%
Person 94.9%
Person 88.3%
Person 83.5%
Person 82.4%
Person 78.9%
Person 52.3%

Captions

Text analysis

Amazon

2

Google

2 MJIA- -YT3RA°2-->AGON
2
MJIA-
-YT3RA°2-->AGON