Human Generated Data

Title

Untitled (group portrait of teacher and young students in classroom)

Date

1955-1956

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11042

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Indoors 99.9
Classroom 99.9
Room 99.9
School 99.9
Human 96.2
Person 96.2
Person 96.2
Person 95.4
Person 94.8
Person 90.4
Person 89.6
Furniture 85
Chair 85
Chair 84.8
Person 82.6
Person 81.1
Person 77.7
Person 73.1
Person 71.9
Person 65.5
Chair 63.7
Chair 62.6
Person 60.4
Chair 59.8
Chair 58.8
Person 56.1
Chair 55.3
Person 49.8
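The Amazon list above repeats labels once per detection (e.g. one "Person" entry per person found), each with its own confidence. A minimal post-processing sketch, using a subset of the values shown, that collapses duplicates to their highest score and drops low-confidence labels (the 90% threshold is an arbitrary choice for illustration, not part of the record):

```python
# Post-process (label, confidence) pairs as reported by the Amazon
# tagger above; values copied from the record, list truncated.
amazon_tags = [
    ("Indoors", 99.9), ("Classroom", 99.9), ("Room", 99.9), ("School", 99.9),
    ("Human", 96.2), ("Person", 96.2), ("Person", 96.2), ("Person", 95.4),
    ("Furniture", 85.0), ("Chair", 85.0), ("Chair", 84.8), ("Person", 49.8),
]

def top_labels(tags, threshold=90.0):
    """Keep each label once at its highest confidence, drop low scores."""
    best = {}
    for label, conf in tags:
        best[label] = max(conf, best.get(label, 0.0))
    return {label: conf for label, conf in best.items() if conf >= threshold}

print(top_labels(amazon_tags))
# → {'Indoors': 99.9, 'Classroom': 99.9, 'Room': 99.9, 'School': 99.9,
#    'Human': 96.2, 'Person': 96.2}
```

The same reduction applies to the Clarifai, Imagga, Google, and Microsoft lists below, which use the same (label, confidence) shape.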

Clarifai
created on 2019-03-26

people 99.8
group 99.5
classroom 99.3
education 99
group together 98.8
many 98.3
furniture 98.2
chair 97.9
teacher 97.3
adult 97.2
room 96.9
desk 96.8
school 96.2
man 95.3
woman 94
seat 92.9
leader 92.7
administration 92.4
sit 90.9
elementary school 90.1

Imagga
created on 2019-03-26

classroom 100
room 100
table 55.5
interior 54
chair 50.6
restaurant 48.4
cafeteria 34.9
furniture 33.6
dining 28.5
modern 28
house 27.6
decor 24.8
floor 24.2
hall 24.1
indoors 23.7
seat 23.6
wood 23.4
empty 22.3
design 22
home 20.7
chairs 20.6
dinner 20.3
tables 19.7
building 19.4
glass 17.9
kitchen 17
food 17
drink 16.7
inside 16.6
indoor 16.4
eat 15.9
stool 15.8
lunch 15.6
structure 15.6
luxury 15.4
meal 14.7
light 14
window 13.7
comfortable 13.4
hotel 13.4
style 13.4
architecture 13.3
contemporary 13.2
decoration 13
bar 12.9
office 12.6
business 12.2
place 12.1
coffee 12
party 12
wine 12
stove 11.8
setting 11.6
lamp 11.4
urban 11.4
desk 11.3
plant 11.2
service 11.1
elegance 10.9
counter 10.9
people 10.6
scene 10.4
wall 10.3
event 10.2
nobody 10.1
catering 9.8
diner 9.8
conference 9.8
work 9.4
meeting 9.4
life 9.4
3d 9.3
city 9.1
group 8.9
oven 8.8
reception 8.8
education 8.7
residential 8.6
apartment 8.6
elegant 8.6
banquet 8.5
learning 8.5
study 8.4
cook 8.2
cabinets 7.9
school 7.9
refrigerator 7.9
cutlery 7.8
male 7.8
decorate 7.6
tile 7.6
relax 7.6
wedding 7.4
day 7.1
wooden 7

Google
created on 2019-03-26

Classroom 97
Class 94.8
Room 93.2
Table 79
Furniture 74.3
State school 67.2
School 58.1
Black-and-white 56.4
Building 53.4
Education 51.8
Cafeteria 51.4

Microsoft
created on 2019-03-26

indoor 95.4
floor 91.2
window 88.4
room 54.8
dining table 16.7
classroom 16.7
library 13.5
school 10.4
person 6.4
meeting 5.6
competition 3.6

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 54.8%
Surprised 45.6%
Confused 45.9%
Disgusted 45.2%
Happy 45.2%
Sad 51.1%
Angry 45.4%
Calm 46.5%

AWS Rekognition

Age 26-43
Gender Male, 55%
Confused 46.1%
Surprised 45.6%
Sad 50.4%
Disgusted 45.2%
Happy 45.3%
Calm 46.9%
Angry 45.6%

AWS Rekognition

Age 26-43
Gender Female, 51.7%
Disgusted 45.1%
Confused 45.1%
Sad 53.3%
Happy 45.1%
Surprised 45.1%
Calm 46%
Angry 45.2%

AWS Rekognition

Age 27-44
Gender Female, 50%
Disgusted 49.7%
Angry 49.6%
Sad 49.7%
Calm 49.7%
Surprised 49.6%
Confused 49.6%
Happy 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Confused 49.6%
Angry 49.6%
Surprised 49.6%
Disgusted 49.6%
Calm 49.6%
Happy 49.9%
Sad 49.7%

AWS Rekognition

Age 35-52
Gender Male, 52.8%
Calm 51%
Surprised 45.1%
Happy 45.7%
Confused 45.4%
Disgusted 45.1%
Sad 47.5%
Angry 45.2%

AWS Rekognition

Age 15-25
Gender Male, 55%
Sad 46.6%
Angry 45.3%
Disgusted 45.2%
Calm 48%
Surprised 45.8%
Confused 48.5%
Happy 45.5%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Surprised 45.1%
Sad 47%
Calm 51.7%
Confused 45.2%
Disgusted 45.1%
Happy 45.3%
Angry 45.5%

AWS Rekognition

Age 26-43
Gender Female, 51.9%
Confused 45.4%
Surprised 46.2%
Sad 45.8%
Disgusted 45.2%
Happy 45.5%
Calm 50.7%
Angry 46.3%

AWS Rekognition

Age 9-14
Gender Male, 50%
Angry 49.6%
Sad 49.9%
Surprised 49.6%
Confused 49.5%
Calm 49.6%
Happy 49.8%
Disgusted 49.5%

AWS Rekognition

Age 48-68
Gender Male, 50.3%
Happy 49.5%
Sad 49.5%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 50.4%
Confused 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.2%
Disgusted 49.6%
Angry 49.6%
Surprised 49.6%
Calm 50%
Happy 49.6%
Sad 49.6%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Female, 54.8%
Angry 45.2%
Happy 45.4%
Sad 47.5%
Calm 51.2%
Confused 45.2%
Disgusted 45.2%
Surprised 45.3%

AWS Rekognition

Age 23-38
Gender Male, 50.1%
Sad 49.6%
Angry 49.6%
Happy 50.1%
Disgusted 49.6%
Surprised 49.5%
Confused 49.5%
Calm 49.6%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Calm 48.4%
Disgusted 45.5%
Sad 46.7%
Confused 46.9%
Happy 45.7%
Surprised 46%
Angry 45.8%

AWS Rekognition

Age 27-44
Gender Female, 50.3%
Angry 49.6%
Sad 49.6%
Surprised 49.5%
Confused 49.5%
Calm 49.5%
Happy 49.6%
Disgusted 50.1%
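Each AWS Rekognition face entry above reports one confidence score per emotion; the labeled emotion for a face is simply the maximum. A small sketch, using the scores from the first face entry (note that the scores here cluster near 45-55%, so the maximum is only weakly dominant):

```python
# Emotion scores for the first Rekognition face entry above.
face = {
    "Surprised": 45.6, "Confused": 45.9, "Disgusted": 45.2,
    "Happy": 45.2, "Sad": 51.1, "Angry": 45.4, "Calm": 46.5,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # → ('Sad', 51.1)
```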

Feature analysis

Amazon

Person 96.2%
Chair 85%

Text analysis

Amazon

XAGOX
t