Human Generated Data

Title

Untitled (presenters and attendees at Vogel Furniture Store Cooking School)

Date

1949

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2513

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Classroom 99.6
Room 99.6
School 99.6
Indoors 99.6
Person 99.2
Human 99.2
Audience 98.5
Crowd 98.5
Person 98.4
Interior Design 97.1
Person 96.7
Person 94.8
Chair 94.7
Furniture 94.7
Person 94.3
Person 94.1
Person 92
Person 89
Person 86.4
Person 79.3
Person 75.5
Person 75.1
Seminar 75.1
Lecture 75.1
Speech 75.1
People 70.9
Person 68.1
Person 62.7
Clothing 59.6
Apparel 59.6
Person 59
Cafeteria 58.4
Restaurant 58.4
Workshop 58.1
Meal 57.6
Food 57.6
Person 54
Person 51.8
Person 46.4
Person 43.6
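
The Amazon tags above match the output format of AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced with boto3; the filename, credential configuration, and MinConfidence threshold are assumptions, not details from this record:

    import boto3

    # Region and credentials are assumed to come from the environment.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        image_bytes = f.read()

    # MinConfidence=40 is an assumption chosen to cover the lowest score listed above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=40)
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")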

Clarifai
created on 2023-10-28

people 100
many 99.5
group 99.2
adult 98.5
woman 98.2
man 96.7
education 95.9
crowd 94.9
child 94.8
group together 94.6
indoors 92.7
administration 91.1
school 90
monochrome 89.5
meeting 87.4
classroom 86.8
furniture 86.5
audience 86.4
leader 85.1
elementary school 82.2
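
The Clarifai concepts above could be reproduced against Clarifai's public REST API. A hedged sketch, assuming the general image recognition model and a placeholder access token; the model version and credentials actually used for this record are not documented here:

    import base64
    import requests

    # Model id and token are assumptions; Clarifai's docs list current values.
    URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    headers = {"Authorization": "Key <personal-access-token>",
               "Content-Type": "application/json"}

    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        payload = {"inputs": [{"data": {"image": {
            "base64": base64.b64encode(f.read()).decode()}}}]}

    response = requests.post(URL, headers=headers, json=payload).json()
    for concept in response["outputs"][0]["data"]["concepts"]:
        # Clarifai reports values in 0-1; scale to percentages as listed above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")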

Imagga
created on 2022-03-05

restaurant 40.2
table 27.7
cafeteria 24.7
hall 24.1
people 24
room 23.8
interior 22.1
dinner 20.5
chair 20.1
party 18.9
musical instrument 17.5
group 16.9
indoors 16.7
person 16.4
teacher 16.2
business 15.8
steel drum 15.3
brass 15.1
dining 14.3
glass 14.2
drink 14.2
modern 14
building 13.9
percussion instrument 13.9
men 13.7
lunch 13.2
life 12.6
meal 12.4
wine 12.2
celebration 12
women 11.8
stage 11.4
banquet 11
food 11
tables 10.8
man 10.7
design 10.7
decor 10.6
crowd 10.5
wind instrument 10.5
meeting 10.4
work 10.3
luxury 10.3
service 10.2
eat 10.1
office 10
male 9.9
chairs 9.8
adult 9.6
urban 9.6
decoration 9.4
light 9.3
event 9.2
horizontal 9.2
wedding 9.2
city 9.1
shop 9
seat 8.9
reception 8.8
napkin 8.8
setting 8.7
classroom 8.6
structure 8.5
alcohol 8.5
elegance 8.4
bar 8.3
occupation 8.2
human 8.2
spectator 8.2
cutlery 7.8
scene 7.8
concert 7.8
educator 7.7
empty 7.7
furniture 7.7
music 7.5
professional 7.5
presentation 7.4
floor 7.4
indoor 7.3
board 7.2
counter 7
musician 7
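
Imagga tags of this kind come from its REST tagging endpoint. A minimal sketch using HTTP Basic auth with placeholder credentials (both assumptions):

    import requests

    # Placeholder Imagga credentials (assumptions).
    auth = ("<api-key>", "<api-secret>")
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        response = requests.post("https://api.imagga.com/v2/tags",
                                 auth=auth, files={"image": f})

    for entry in response.json()["result"]["tags"]:
        # Imagga confidences are already on a 0-100 scale.
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")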

Google
created on 2022-03-05

Photograph 94.2
Black 89.7
Hat 86.5
Black-and-white 85.3
Style 83.9
Line 81.8
People 78.1
Suit 75.7
Monochrome photography 74.3
Snapshot 74.3
Monochrome 73.8
Event 73.3
Room 70.3
Team 70.3
Art 68.7
Crowd 68.3
Audience 68.1
T-shirt 66.2
Font 66
Stock photography 63.5
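
The Google tags above correspond to Cloud Vision's label-detection feature. A minimal sketch with the google-cloud-vision client (the filename is an assumption); Vision scores are 0-1 and are scaled to percentages to match the listing:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # Vision scores are 0-1; scale to percentages as listed above.
        print(f"{label.description} {label.score * 100:.1f}")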

Microsoft
created on 2022-03-05

person 99.4
text 88.1
table 74.7
people 65.5
group 64.1
clothing 59.8
crowd 0.5
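
The Microsoft tags match the shape of the Azure Computer Vision Tag Image operation. A sketch against the v3.2 REST endpoint, with placeholder resource name and key (both assumptions):

    import requests

    # Resource endpoint and key are placeholders (assumptions).
    endpoint = "https://<resource-name>.cognitiveservices.azure.com"
    headers = {"Ocp-Apim-Subscription-Key": "<subscription-key>",
               "Content-Type": "application/octet-stream"}

    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        response = requests.post(f"{endpoint}/vision/v3.2/tag",
                                 headers=headers, data=f.read())

    for tag in response.json()["tags"]:
        # Azure confidences are 0-1; scale to percentages as listed above.
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")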

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 83.9%
Calm 55.6%
Happy 25.1%
Sad 8.4%
Angry 3.2%
Fear 2.5%
Confused 2%
Surprised 1.6%
Disgusted 1.5%

AWS Rekognition

Age 20-28
Gender Female, 90.7%
Sad 97.3%
Calm 1.1%
Confused 0.7%
Angry 0.4%
Happy 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 19-27
Gender Female, 99.2%
Calm 62.9%
Happy 28.9%
Sad 3.4%
Surprised 2.1%
Confused 1.1%
Angry 0.6%
Disgusted 0.6%
Fear 0.5%

AWS Rekognition

Age 18-24
Gender Male, 71.9%
Fear 44.7%
Calm 35.2%
Sad 14.1%
Happy 2.2%
Confused 1.4%
Angry 1%
Disgusted 0.8%
Surprised 0.7%

AWS Rekognition

Age 19-27
Gender Male, 98.8%
Calm 36.5%
Confused 35.1%
Sad 17.3%
Surprised 3.6%
Happy 2.3%
Disgusted 2%
Fear 2%
Angry 1.3%

AWS Rekognition

Age 23-31
Gender Female, 68.3%
Calm 97.8%
Happy 0.6%
Angry 0.4%
Sad 0.4%
Disgusted 0.3%
Surprised 0.3%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Male, 56.1%
Happy 42%
Sad 37.8%
Calm 7.9%
Confused 3.9%
Disgusted 3.3%
Angry 2.4%
Fear 1.6%
Surprised 1.1%

AWS Rekognition

Age 18-24
Gender Female, 62.9%
Sad 33.6%
Fear 21.1%
Confused 19.5%
Calm 14.6%
Surprised 4.1%
Disgusted 3.8%
Angry 1.9%
Happy 1.3%
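
Each AWS Rekognition block above (age range, gender, and ranked emotion confidences) mirrors one entry of DetectFaces output when all facial attributes are requested. A minimal boto3 sketch; the filename is an assumption:

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        response = client.detect_faces(Image={"Bytes": f.read()},
                                       Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # e.g. {"Low": 16, "High": 24}
        gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 83.9}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort by confidence to match the listing above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")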

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
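
Unlike Rekognition, the Google Vision blocks report per-face likelihood categories rather than percentages. A minimal sketch with the google-cloud-vision client showing where those enum values come from (the filename is an assumption):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)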

Feature analysis

Amazon

Person
Person 99.2%
Person 98.4%
Person 96.7%
Person 94.8%
Person 94.3%
Person 94.1%
Person 92%
Person 89%
Person 86.4%
Person 79.3%
Person 75.5%
Person 75.1%
Person 68.1%
Person 62.7%
Person 59%
Person 54%
Person 51.8%
Person 46.4%
Person 43.6%

Text analysis

Amazon

RANGES
Maytag
will Maytag
will
Journe
LICEN
Done
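
The Amazon text results are OCR detections of the kind returned by Rekognition's DetectText operation, which reports both full lines and individual words. A minimal boto3 sketch (the filename is an assumption):

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Rekognition returns both LINE and WORD entries; print lines only.
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])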

Google

YT3RA2-XAGOX
YT3RA2-XAGOX
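
The Google result matches Cloud Vision text detection, which would also explain the same string appearing twice above: the first annotation is the full detected text and later entries are per-word detections. A minimal sketch (the filename is an assumption):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # The first annotation is the full detected string; later entries are per word.
    if response.text_annotations:
        print(response.text_annotations[0].description)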