Human Generated Data

Title

Untitled (men in military uniforms seated at conference table with photos hanging on wall)

Date

c. 1950

People

Artist: Jack Rodden Studio, American 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13586

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99.2
Human 99.2
Person 99.1
Person 99.1
Person 99.1
Person 99
Chair 98.2
Furniture 98.2
Chair 97.9
Person 97.8
Person 95.7
Person 95.2
Room 93.9
Indoors 93.9
Sitting 83.6
Person 81.6
People 75.7
Clothing 74
Apparel 74
Shoe 73.1
Footwear 73.1
Table 68.5
Person 65.2
Person 61.8
Living Room 58.8
Suit 58.5
Coat 58.5
Overcoat 58.5
Photography 58.4
Photo 58.4
Chair 54.8

Clarifai
created on 2023-10-28

people 99.7
group 99.2
education 98.1
room 97.9
man 97.5
adult 96.5
sit 95.6
indoors 95.2
woman 94.7
school 94.4
furniture 94.3
chair 94
group together 93.5
meeting 93.4
league 93
leader 90.2
sitting 88.4
classroom 88.2
many 87.8
teacher 87.7

Imagga
created on 2022-02-04

room 51.1
classroom 43.4
table 31.3
people 31.2
man 29.6
office 29.2
business 29.1
meeting 28.3
businessman 28.2
person 27.8
male 25.5
group 25
indoors 24.6
chair 24.4
interior 23.9
women 22.9
modern 22.4
team 22.4
men 22.3
teacher 22.1
adult 22.1
sitting 21.5
professional 20.1
executive 19.8
work 19.7
together 19.3
corporate 18.9
happy 18.8
brass 18.5
communication 18.5
businesswoman 18.2
teamwork 17.6
conference 17.6
restaurant 17
wind instrument 17
indoor 16.4
desk 16.1
hall 16
home 15.9
manager 15.8
laptop 15.8
worker 15.5
computer 15.3
talking 15.2
smiling 15.2
success 14.5
lifestyle 14.4
couple 13.9
education 13.9
suit 13.5
businesspeople 13.3
job 13.3
glass 13.2
musical instrument 12.9
floor 12.1
working 11.5
workplace 11.4
smile 11.4
student 11.3
togetherness 11.3
presentation 11.2
portrait 11
successful 11
class 10.6
educator 10.3
happiness 10.2
board 9.9
holding 9.9
furniture 9.9
chairs 9.8
cheerful 9.7
boss 9.6
design 9.6
ethnic 9.5
learning 9.4
study 9.3
coffee 9.3
handsome 8.9
tables 8.9
seminar 8.8
looking 8.8
diversity 8.6
friends 8.4
friendship 8.4
mature 8.4
company 8.4
diverse 7.8
black 7.8
discussion 7.8
color 7.8
two people 7.8
leader 7.7
sofa 7.7
finance 7.6
enjoying 7.6
plan 7.6
house 7.5
drink 7.5
life 7.4
building 7.4
training 7.4
employee 7.4
food 7.4
light 7.3
seat 7.3
new 7.3
outfit 7.2
day 7.1

Google
created on 2022-02-04

Furniture 94.8
Chair 92
Style 83.8
Black-and-white 83.1
Font 75.9
Suit 74.3
Event 73.4
Monochrome photography 73.4
Building 72.9
Monochrome 72.3
Art 69.2
Room 67.6
Sitting 66.3
Table 63.9
Stock photography 63.6
Team 56.9
Rectangle 56.2
History 54.5
Collaboration 53.7
Painting 53.3

Microsoft
created on 2022-02-04

wall 96.6
person 95.1
text 94
computer 89
laptop 83.3
furniture 77.3
clothing 64.8
table 63.3
room 41.2

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 72.3%
Happy 60.8%
Calm 29.4%
Sad 7.1%
Fear 1.2%
Confused 0.7%
Disgusted 0.4%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.4%
Calm 31.5%
Sad 24.5%
Happy 18.2%
Confused 9.5%
Surprised 5.6%
Fear 5%
Disgusted 3.3%
Angry 2.3%

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Calm 99.8%
Happy 0.1%
Confused 0.1%
Sad 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 90.7%
Calm 55.5%
Fear 39%
Happy 1.9%
Sad 1.5%
Confused 0.7%
Surprised 0.5%
Disgusted 0.5%
Angry 0.4%

AWS Rekognition

Age 39-47
Gender Male, 96.8%
Sad 82.8%
Calm 10.8%
Happy 2.6%
Confused 1.2%
Angry 0.8%
Disgusted 0.7%
Fear 0.7%
Surprised 0.5%

AWS Rekognition

Age 23-31
Gender Female, 95.6%
Calm 97.4%
Sad 0.8%
Happy 0.7%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 36-44
Gender Male, 73.3%
Calm 66.1%
Surprised 9.4%
Happy 9.1%
Confused 6.6%
Angry 3.9%
Sad 2.6%
Disgusted 1.2%
Fear 1.1%

AWS Rekognition

Age 48-56
Gender Male, 97.7%
Calm 47.4%
Happy 36%
Sad 6.9%
Confused 2.9%
Surprised 1.9%
Fear 1.8%
Disgusted 1.7%
Angry 1.4%

AWS Rekognition

Age 27-37
Gender Male, 95.7%
Sad 94.9%
Calm 3.1%
Disgusted 1%
Fear 0.4%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 89.5%
Confused 3.8%
Sad 2.5%
Surprised 1.6%
Happy 1.6%
Disgusted 0.5%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 43-51
Gender Male, 98.6%
Sad 57.8%
Calm 39.9%
Confused 1.1%
Happy 0.4%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Female, 89.9%
Happy 37.1%
Sad 25.8%
Confused 13.9%
Calm 8.1%
Disgusted 5.9%
Fear 3.3%
Surprised 3%
Angry 2.9%

AWS Rekognition

Age 41-49
Gender Female, 76.2%
Happy 82.8%
Calm 8.8%
Fear 3.3%
Sad 2.6%
Surprised 0.8%
Disgusted 0.7%
Angry 0.6%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Shoe
Person 99.2%
Person 99.1%
Person 99.1%
Person 99.1%
Person 99%
Person 97.8%
Person 95.7%
Person 95.2%
Person 81.6%
Person 65.2%
Person 61.8%
Chair 98.2%
Chair 97.9%
Chair 54.8%
Shoe 73.1%

Text analysis

Amazon

MJ17--YT37A--

Google

MJI7-- Y T37A°2- - XAGOX
MJI7--
Y
T37A°2-
-
XAGOX