Human Generated Data

Title

Untitled (advertising executives at meeting around table)

Date

1957

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20107

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Person 98.9
Person 98.8
Person 97
Person 96.4
Person 96.2
Chair 93.4
Furniture 93.4
Electronics 75.7
Clothing 73.1
Apparel 73.1
Studio 70.5
Sitting 67.4
Monitor 60.7
Display 60.7
Screen 60.7
Table 58
Person 49.1

Clarifai
created on 2023-10-22

people 99.5
group 99.1
adult 98.8
woman 98.3
group together 97
desk 96.8
man 95.5
sit 95.3
furniture 95
room 93.1
education 92.9
administration 91.4
sitting 90.8
three 88.8
employee 88.5
several 88.4
leader 87.2
child 85.9
indoors 84.9
actor 84.5

Imagga
created on 2022-03-05

classroom 51.5
businessman 44.1
meeting 43.3
room 42.9
office 42.1
business 41.3
group 41.1
man 41
male 39
people 38.5
team 37.6
businesswoman 37.3
person 33.3
table 31.6
teamwork 29.7
adult 28.7
businesspeople 28.5
men 28.3
executive 27.8
laptop 27.8
professional 27.7
corporate 26.6
happy 25.1
colleagues 24.3
sitting 24
together 23.7
work 23.6
desk 23.6
computer 21.9
talking 21.9
women 21.4
job 21.2
working 21.2
communication 21
smiling 19.5
successful 18.3
workplace 18.1
manager 17.7
indoors 17.6
worker 17.3
teacher 17.1
employee 17
smile 16.4
confident 16.4
associates 15.7
discussion 15.6
indoor 15.5
partners 14.6
cooperation 14.5
success 14.5
cheerful 13.8
coworkers 13.8
discussing 13.7
lifestyle 13.7
company 13
looking 12.8
suit 12.7
conference 12.7
modern 12.6
student 12.6
corporation 12.5
happiness 12.5
collaboration 11.8
conversation 11.7
handsome 11.6
30s 11.5
boss 11.5
career 11.4
education 11.3
senior 11.2
presentation 11.2
mature 11.2
casual 11
30 35 years 10.8
class 10.6
partnership 10.6
coffee 10.2
businessmen 9.7
staff 9.7
hall 9.7
busy 9.6
leader 9.6
planning 9.6
collar 9.6
learning 9.4
document 9.3
chair 9.2
technology 8.9
explaining 8.9
consultant 8.8
couple 8.7
employment 8.7
center 8.5
females 8.5
friends 8.5
brass 8.4
friendship 8.4
camera 8.4
color 8.3
nurse 8.2
new 8.1
interaction 7.9
chatting 7.9
diverse 7.8
students 7.8
portrait 7.8
expression 7.7
four 7.7
finance 7.6
horizontal 7.5
contemporary 7.5
showing 7.5
study 7.5
wind instrument 7.5
board 7.3
occupation 7.3
interior 7.1
paper 7.1

Google
created on 2022-03-05

Window 85.5
Black-and-white 85.2
Style 83.9
Building 75.2
Technology 74.9
Monochrome 74.2
Monochrome photography 74.1
Music 69.4
Table 69.3
Room 69.1
Chair 68.1
Audio equipment 67.1
Event 67
Curtain 65.5
Conversation 65.2
Sitting 64.5
Suit 61.5
Desk 57.3
Font 54.4
Eyewear 51.4

Microsoft
created on 2022-03-05

person 95.1
laptop 93.5
text 93.2
indoor 91.2
computer 87.1
clothing 80
people 70.8
man 61.9
black and white 54.8

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 98.7%
Calm 98.8%
Angry 0.3%
Sad 0.2%
Surprised 0.2%
Happy 0.2%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Calm 51.2%
Sad 45%
Surprised 2.6%
Confused 0.8%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 64.9%
Calm 99.3%
Sad 0.3%
Surprised 0.2%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 45-53
Gender Male, 98.8%
Calm 75.4%
Sad 16.2%
Angry 5.6%
Disgusted 1%
Confused 0.8%
Surprised 0.6%
Fear 0.3%
Happy 0.2%

AWS Rekognition

Age 47-53
Gender Male, 99.2%
Calm 59.1%
Sad 38.9%
Confused 0.8%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99.2%
Person 98.9%
Person 98.8%
Person 97%
Person 96.4%
Person 96.2%
Person 49.1%
Chair 93.4%

Text analysis

Amazon

TALES
TOMOR
O
ean
VT37A2

Google

TALES ON TOMOR
TALES
ON
TOMOR