Human Generated Data

Title

Untitled (children sitting at a table, eating)

Date

1958

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16458

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.5
Human 99.5
Person 90
Person 89.9
Chair 84.3
Furniture 84.3
Person 78.6
People 72.1
Cafeteria 71.7
Restaurant 71.7
Person 71
Person 68.4
Person 67.3
Person 66.9
Helmet 66.9
Clothing 66.9
Apparel 66.9
Suit 64.8
Coat 64.8
Overcoat 64.8
Table 63.8
Room 59.5
Indoors 59.5
Machine 58.8
Crowd 58.7
Sitting 58
Classroom 57.1
School 57.1
Spoke 55
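
The Amazon values above are label-detection confidences on a 0-100 scale. A minimal sketch of how such a list can be produced with the AWS Rekognition DetectLabels API, assuming boto3 credentials are configured and a hypothetical local file photo.jpg:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=55.0,  # the list above bottoms out around 55
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")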

Clarifai
created on 2023-10-28

people 99.8
classroom 99.2
education 99.2
group 98.6
furniture 98.4
school 98.4
sit 98.2
elementary school 98
room 98
child 97.8
monochrome 97.6
chair 97.2
indoors 97.2
desk 97
many 96.8
man 96.7
adult 96.5
sitting 96.2
teacher 96
group together 95.5
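
The Clarifai concepts above have the shape returned by Clarifai's general image-recognition model. A sketch of how similar output can be requested from the Clarifai v2 REST API, assuming a hypothetical personal access token and image URL:

    import requests

    # Hypothetical credentials and image URL.
    PAT = "YOUR_CLARIFAI_PAT"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.post(
        "https://api.clarifai.com/v2/users/clarifai/apps/main"
        "/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    # Concept values are 0-1; scale to match the percentages above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")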

Imagga
created on 2022-02-11

classroom 93.8
room 93.1
interior 28.3
chair 28.1
table 26
modern 22.4
business 18.8
floor 18.6
person 18.5
man 18.1
people 17.8
house 16.7
indoors 16.7
furniture 16.7
office 16.4
glass 16.3
male 16.3
work 15.7
design 14.6
home 14.3
blackboard 14
brass 13
men 12.9
kitchen 12.8
desk 12.6
architecture 12.5
businessman 12.3
building 12.1
group 12.1
inside 11.9
adult 11.9
indoor 10.9
wind instrument 10.8
light 10.7
happy 10.6
empty 10.3
job 9.7
hospital 9.6
urban 9.6
education 9.5
hall 9.4
city 9.1
board 9
team 8.9
medical 8.8
clinic 8.7
teacher 8.7
window 8.7
decoration 8.7
class 8.7
scene 8.6
smile 8.5
wood 8.3
bar 8.3
laptop 8.3
equipment 8.1
computer 8.1
musical instrument 8
decor 7.9
working 7.9
women 7.9
center 7.8
space 7.7
wall 7.7
apartment 7.7
exam 7.7
casual 7.6
restaurant 7.6
meeting 7.5
back 7.3
cheerful 7.3
success 7.2
smiling 7.2
lifestyle 7.2
life 7.1
portrait 7.1
worker 7.1
medicine 7
travel 7
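
A sketch of how a tag list like the Imagga one above can be fetched from Imagga's /v2/tags endpoint, assuming hypothetical API credentials and image URL:

    import requests

    # Hypothetical credentials and image URL.
    API_KEY, API_SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP basic auth
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")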

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 99.9
furniture 88.3
chair 84.1
person 78.8
table 59.9
old 41.2
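
The Microsoft tags above match the output shape of the Azure Computer Vision Analyze endpoint. A sketch assuming a hypothetical Azure resource endpoint, subscription key, and image URL:

    import requests

    # Hypothetical endpoint, key, and image URL.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_KEY"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    # Confidences are 0-1; scale to match the percentages above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")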

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 65.6%
Calm 28.9%
Happy 27%
Surprised 23.5%
Angry 8.3%
Fear 7.6%
Sad 2.7%
Disgusted 1.1%
Confused 0.9%

AWS Rekognition

Age 24-34
Gender Male, 99.8%
Calm 64.4%
Surprised 28%
Angry 2.7%
Sad 2.2%
Happy 0.9%
Fear 0.9%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 9-17
Gender Male, 99.2%
Surprised 49.3%
Calm 33.6%
Happy 11.3%
Angry 1.6%
Disgusted 1.4%
Sad 1.2%
Fear 1%
Confused 0.6%

AWS Rekognition

Age 34-42
Gender Female, 50.5%
Calm 79.4%
Sad 14.7%
Happy 2%
Fear 1.5%
Confused 1.3%
Surprised 0.5%
Disgusted 0.3%
Angry 0.3%
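
The age ranges, gender estimates, and emotion percentages above match the output of the AWS Rekognition DetectFaces API when all attributes are requested. A minimal sketch, assuming configured boto3 credentials and a hypothetical local file photo.jpg:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort to match the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")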

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
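
The likelihood labels above (Very unlikely, Likely, ...) correspond to the Google Cloud Vision face-detection likelihood enum. A sketch using the google-cloud-vision client, assuming application credentials are configured and a hypothetical local file photo.jpg:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Enum names such as VERY_UNLIKELY map to the labels shown above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)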

Feature analysis

Amazon

Person 99.5%
Person 90%
Person 89.9%
Person 78.6%
Person 71%
Person 68.4%
Person 67.3%
Person 66.9%
Chair 84.3%
Helmet 66.9%

Categories

Imagga

interior objects 99.6%

Text analysis

Amazon

23

Google

MJIA- -YT3A°2- -XAGON
MJIA-
-YT3A°2-
-XAGON
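
Both blocks above are OCR output; the Google fragments look like mirrored film-edge markings (plausibly KODAK SAFETY FILM read in reverse through the negative). A sketch of how the Amazon result can be reproduced with the Rekognition DetectText API, assuming configured credentials and a hypothetical local file photo.jpg:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections only; WORD entries repeat the same text piecewise.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])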