Human Generated Data

Title

Untitled (female student showing ceramic work to teacher in classroom with two male students working behind them)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9392

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 98.8
Person 97.4
Person 97.1
Chair 95.3
Furniture 95.3
Person 93.6
Person 83
Indoors 82.2
Room 77.3
Person 73.7
Person 71.1
Restaurant 66.3
Cafeteria 64.6
Person 61.9
Shelf 60.2
Table 57.8
Living Room 56.8
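
The label and confidence pairs above are characteristic of Amazon Rekognition's DetectLabels output. The record itself does not document how the tags were produced, so the following is only a minimal sketch of how comparable output could be retrieved with boto3; "photo.jpg" is a placeholder path, not a file referenced by this record, and the MaxLabels/MinConfidence values are illustrative assumptions.

# Minimal sketch, assuming boto3 is installed and AWS credentials are configured.
# "photo.jpg" is a placeholder path, not part of this record.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,       # limit the number of returned labels
        MinConfidence=55,   # drop low-confidence labels
    )

# Print "Name Confidence" pairs, e.g. "Person 99.8"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")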

Clarifai
created on 2023-10-26

people 99.9
adult 98.7
furniture 96.2
administration 96
room 95.8
monochrome 95.8
group 95.8
group together 95.5
man 94.5
woman 94
leader 91.4
elderly 89.3
sit 89.2
chair 88.3
home 88.1
desk 86.4
education 84.8
newspaper 84.1
war 83.4
two 83.2

Imagga
created on 2022-01-23

laptop 58.5
office 56.6
computer 55.5
classroom 43.9
room 43.6
working 34.5
people 34
work 33.8
man 32.3
business 32.2
indoors 31.6
desk 29
adult 26.6
male 24.8
technology 24.5
happy 24.4
smiling 23.9
person 23.8
businessman 23
home 22.3
communication 21.8
businesswoman 21.8
library 19.9
sitting 19.8
professional 19.2
center 18.3
table 18.1
corporate 18
meeting 17.9
teamwork 17.6
notebook 17.3
talking 17.1
building 17
keyboard 16
job 15.9
workplace 15.3
lifestyle 15.2
team 14.3
worker 14.3
smile 14.3
modern 14
men 13.7
group 13.7
executive 13.1
education 13
monitor 13
occupation 12.8
student 12.7
businesspeople 12.3
together 12.3
success 12.1
attractive 11.9
indoor 11.9
portrait 11.7
30s 11.5
interior 11.5
wireless 11.4
couple 11.3
women 11.1
phone 11.1
casual 11
happiness 11
typing 10.7
looking 10.4
mature 10.2
camera 10.2
inside 10.1
house 10
structure 9.9
suit 9.9
discussing 9.8
pretty 9.8
cheerful 9.8
discussion 9.7
colleagues 9.7
school 9.7
busy 9.6
using 9.6
studying 9.6
reading 9.5
learning 9.4
senior 9.4
face 9.2
color 8.9
conversation 8.7
expression 8.5
contemporary 8.5
horizontal 8.4
20s 8.2
shop 8.2
engineer 7.9
conference 7.8
consultant 7.8
mid adult 7.7
restaurant 7.7
college 7.6
friends 7.5
manager 7.5
clothing 7.4
confident 7.3
handsome 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 97.3
text 95.6
furniture 70.3
black and white 68.9
window 60.5
house 59.8
table 58.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-22
Gender Male, 96%
Calm 83.6%
Sad 13.4%
Angry 0.8%
Happy 0.7%
Confused 0.6%
Disgusted 0.4%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 23-33
Gender Male, 94.4%
Calm 98.7%
Sad 0.5%
Angry 0.4%
Happy 0.1%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 85.3%
Calm 93.5%
Sad 3.3%
Surprised 0.8%
Angry 0.8%
Happy 0.6%
Confused 0.6%
Fear 0.2%
Disgusted 0.2%
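
The age range, gender, and emotion estimates above match the shape of Amazon Rekognition's DetectFaces response when all attributes are requested. As a hedged illustration only (the actual pipeline behind this record is not documented here), a minimal boto3 sketch; "photo.jpg" is again a placeholder path.

# Minimal sketch, assuming boto3 with valid AWS credentials.
# Attributes=["ALL"] requests age range, gender, and emotion estimates
# like those listed above; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 16, "High": 22}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 96.0}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to mirror the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")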

Feature analysis

Amazon

Person 99.8%
Chair 95.3%

Categories

Text analysis

Amazon

KODVK-EVEELA
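
The string above is raw machine-detected text from the photograph, reproduced as returned. As an illustrative sketch only, line-level detections of this kind can be obtained with Rekognition's DetectText call; "photo.jpg" is a placeholder path, not a file referenced by this record.

# Minimal sketch, assuming boto3 with valid AWS credentials.
# "photo.jpg" is a placeholder path, not part of this record.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Print each detected line of text, e.g. "KODVK-EVEELA"
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])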