Human Generated Data

Title

Untitled (children listening to story in classroom)

Date

1952

People

Artist: John Howell, American, active 1930s–1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21554

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.8
Human 98.8
Person 95.9
Person 95.9
Person 95.3
Person 93.6
Person 93.2
Kindergarten 92.3
Person 86.3
Furniture 83.7
Interior Design 81.7
Indoors 81.7
Room 77.8
Shorts 77.1
Clothing 77.1
Apparel 77.1
People 72.1
Person 72
School 69.1
Floor 66.6
Person 64
Classroom 63.9
Person 61.1
Screen 59.7
Electronics 59.7
Monitor 59.2
Display 59.2
Flooring 58.7
Person 54.9
Chair 51.8
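
The label/confidence rows above have the shape of Amazon Rekognition's DetectLabels output. A minimal boto3 sketch, assuming a hypothetical S3 bucket and object key:

import boto3

# Sketch only: bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=50,  # drop labels scored below 50
)
for label in response["Labels"]:
    # Each label is a name plus a 0-100 confidence, e.g. "Person 98.8".
    print(f"{label['Name']} {label['Confidence']:.1f}")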

Clarifai
created on 2023-10-22

people 99.9
child 99.9
group 99.8
group together 99.3
many 99.2
education 97.8
boy 97.7
school 96.6
several 96.5
recreation 95
monochrome 93.6
elementary school 91.7
adult 91
classroom 90.3
woman 89
teacher 89
room 88.5
man 88.3
music 87.4
family 86.4
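
The Clarifai concepts above can be requested from its v2 predict endpoint. A hedged sketch using the requests library; the API key, model id, and image URL are placeholders:

import requests

# Sketch only: credentials, model id, and URL are placeholders.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1; scale to match the 0-100 figures above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")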

Imagga
created on 2022-03-05

wind instrument 30.2
brass 29.3
man 27.5
musical instrument 26.6
people 25.1
sax 23.8
trombone 22.7
male 19.8
men 17.2
person 15.9
adult 15.8
business 15.8
city 14.1
urban 14
black 13.2
businessman 12.3
teacher 12.3
interior 11.5
group 11.3
motion 11.1
team 10.7
shop 10.6
indoors 10.5
room 10.4
bass 9.9
device 9.9
style 9.6
stringed instrument 9.6
professional 9.5
women 9.5
play 9.5
lifestyle 9.4
active 9.4
sport 9.3
indoor 9.1
silhouette 9.1
crowd 8.6
youth 8.5
music 8.4
modern 8.4
power 8.4
floor 8.4
human 8.2
girls 8.2
happy 8.1
suit 8.1
home 8
day 7.8
classroom 7.8
boy 7.8
move 7.7
walk 7.6
studio 7.6
dark 7.5
holding 7.4
sword 7.4
street 7.4
speed 7.3
life 7.3
success 7.2
office 7.2
equipment 7.1
portrait 7.1
job 7.1
travel 7
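
Imagga's tag list has the shape returned by its v2 tagging endpoint. A hedged sketch; the key/secret pair and image URL are placeholders:

import requests

# Sketch only: Imagga uses HTTP Basic auth with an API key and secret.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")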

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

clothing 89.1
text 86.7
person 86.6
table 62.4
woman 59
black and white 56.2
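
The Microsoft tags match the shape of Azure Computer Vision's image-tagging endpoint (v3.2 REST API). A hedged sketch; the resource endpoint and key are placeholders:

import requests

# Sketch only: endpoint and subscription key are placeholders.
resp = requests.post(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
for tag in resp.json()["tags"]:
    # Confidence is 0-1; scale to match the 0-100 figures above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")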

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 97.7%
Calm 82.4%
Happy 14.3%
Sad 1.5%
Disgusted 0.8%
Confused 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.2%
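
The age range, gender, and emotion scores above have the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A boto3 sketch with hypothetical placeholders:

import boto3

# Sketch only: bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],  # include age range, gender, and emotions
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")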

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
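
Google Vision reports face attributes as likelihood buckets rather than numeric scores, as listed above. A hedged sketch with the google-cloud-vision client; the file path is a placeholder:

from google.cloud import vision

# Sketch only: the local file path is a placeholder.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihoods are enum buckets: VERY_UNLIKELY through VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)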

Feature analysis

Amazon

Person 98.8%
Person 95.9%
Person 95.9%
Person 95.3%
Person 93.6%
Person 93.2%
Person 86.3%
Person 72%
Person 64%
Person 61.1%
Person 54.9%
Chair 51.8%
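
The per-detection rows above correspond to the Instances array that DetectLabels returns for countable objects such as Person and Chair. A boto3 sketch with hypothetical placeholders:

import boto3

# Sketch only: bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        # Each instance has its own confidence and a relative bounding box.
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% at {box}")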

Categories

Imagga

interior objects 97.2%
paintings art 1.6%
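
Imagga's category pairs have the shape of its v2 categorization endpoint with the personal_photos categorizer. A hedged sketch; credentials and image URL are placeholders:

import requests

# Sketch only: key/secret and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")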

Text analysis

Amazon

80
KODVK-SVEELA

Google

YT37A2- AGON
YT37A2-
AGON
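
Raw strings like those above have the shape of OCR output from Amazon Rekognition's DetectText operation. A boto3 sketch with hypothetical placeholders:

import boto3

# Sketch only: bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries are also returned
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")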