Human Generated Data

Title

Untitled (group of men sitting around table looking at blueprint plan)

Date

1954

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15127

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.4
Person 99
Person 98.9
Person 98.6
Person 96.9
Sitting 96.2
Person 96
Room 94.9
Indoors 94.9
Person 94.7
Person 91.9
Person 87.7
Meeting Room 83.5
Conference Room 83.5
Furniture 81.7
Table 75.9
Classroom 73.4
School 73.4
Electronics 72.3
Screen 72.3
Display 65.8
Monitor 65.8
Desk 62.7
LCD Screen 62.5
Restaurant 60
Cafeteria 60
Computer 59.2
Pc 59.2
Finger 57.5
Flooring 55.5
Office 55.5
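Each Amazon Rekognition tag above pairs a label with a confidence score on a 0–100 scale. A minimal sketch of filtering such label/score pairs by a confidence threshold (the data and the 80% cutoff are illustrative, copied from the list above):

```python
# Label/confidence pairs as reported by Amazon Rekognition above.
labels = [
    ("Person", 99.6), ("Sitting", 96.2), ("Room", 94.9),
    ("Meeting Room", 83.5), ("Table", 75.9), ("Classroom", 73.4),
    ("Monitor", 65.8), ("Office", 55.5),
]

def confident_labels(pairs, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
```

With the 80% threshold this keeps only the high-confidence labels (Person, Sitting, Room, Meeting Room); lowering the threshold admits progressively less certain tags.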

Imagga
created on 2022-03-05

office 47.1
computer 43.3
people 37.4
business 37.1
desk 36.1
man 35.8
working 34.5
adult 34
work 33
laptop 31.9
professional 31.3
person 30.8
male 27.7
businessman 27.4
table 27.3
corporate 26.6
businesswoman 26.4
monitor 25.5
sitting 24.9
meeting 24.5
executive 23.5
indoors 22.9
job 22.1
team 21.5
group 21
businesspeople 20.9
worker 20.5
happy 20.1
technology 20
workplace 19.1
smiling 18.8
teamwork 18.6
keyboard 17.8
successful 16.5
smile 16.4
women 15.8
education 15.6
career 15.1
manager 14.9
confident 14.6
center 14.4
room 14.4
occupation 13.8
men 13.7
looking 13.6
portrait 13.6
home 13.6
communication 13.4
patient 13.3
success 12.9
20s 12.8
colleagues 12.6
screen 12.6
partner 12.6
specialist 12.3
senior 12.2
indoor 11.9
30s 11.5
teacher 11.4
attractive 11.2
presentation 11.2
document 11.1
casual 11
suit 11
coworkers 10.8
hospital 10.8
notebook 10.7
paper 10.3
board 10
partners 9.7
equipment 9.6
staff 9.6
talking 9.5
color 9.5
pen 9.4
classroom 9.4
showing 9.4
mature 9.3
phone 9.2
hand 9.1
newspaper 9
associates 8.9
medical 8.8
colleague 8.8
conference 8.8
consultant 8.8
shop 8.7
cooperation 8.7
employee 8.7
project 8.7
collar 8.6
engineer 8.5
two 8.5
horizontal 8.4
house 8.4
clinic 8.2
electronic equipment 8.1
cheerful 8.1
to 8
lifestyle 8
debate 7.9
together 7.9
collaboration 7.9
secretary 7.8
student 7.8
corporation 7.7
coat 7.7
illness 7.6
adults 7.6
handsome 7.1
interior 7.1

Google
created on 2022-03-05

Style 83.8
Black-and-white 83
Font 79.3
Monochrome photography 71.4
Event 69.2
T-shirt 69
Monochrome 68.7
Suit 68.2
Room 66.9
Desk 64.3
Stock photography 63.9
History 63
Sitting 62.7
Art 62.4
Table 62.4
Team 61
Photo caption 59.9
Class 59
Collaboration 58.1
Service 56.8

Microsoft
created on 2022-03-05

text 98.5
clothing 95.3
man 94.7
person 90
table 88.6
window 82.5
laptop 51.8

Face analysis

Amazon

Google

AWS Rekognition

Age 45-51
Gender Male, 99.6%
Calm 96.1%
Sad 2.6%
Confused 0.8%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 47-53
Gender Male, 97.9%
Calm 97%
Sad 2.3%
Surprised 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Sad 59.4%
Calm 37.2%
Surprised 1%
Disgusted 0.7%
Angry 0.6%
Happy 0.5%
Fear 0.4%
Confused 0.3%

AWS Rekognition

Age 27-37
Gender Male, 85.7%
Sad 91.6%
Calm 7.5%
Confused 0.5%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 23-33
Gender Male, 96.8%
Calm 99.6%
Surprised 0.1%
Confused 0.1%
Sad 0.1%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 62.1%
Calm 93.9%
Sad 5.3%
Surprised 0.5%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%
Confused 0.1%
Angry 0%

AWS Rekognition

Age 35-43
Gender Female, 59.7%
Calm 98%
Sad 0.7%
Disgusted 0.3%
Surprised 0.3%
Angry 0.3%
Happy 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Male, 98.7%
Calm 79.7%
Sad 8.1%
Angry 5.1%
Disgusted 2%
Confused 1.6%
Surprised 1.4%
Fear 1.4%
Happy 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
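Each AWS Rekognition face block above reports a full emotion distribution whose values sum to roughly 100%. A sketch of selecting the dominant emotion for one face (the scores are copied from the third AWS Rekognition block above):

```python
# Emotion scores for one detected face, copied from the results above.
face = {
    "Sad": 59.4, "Calm": 37.2, "Surprised": 1.0, "Disgusted": 0.7,
    "Angry": 0.6, "Happy": 0.5, "Fear": 0.4, "Confused": 0.3,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

name, score = dominant_emotion(face)
print(f"{name}: {score}%")
```

For this face the result is "Sad" at 59.4%, though "Calm" at 37.2% is close enough that a single-label summary discards real ambiguity in the distribution.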

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people standing next to a window 54.3%
a group of people standing in front of a window 52.8%
a group of people in a room 52.7%

Text analysis

Amazon

MISSOURIAN

Google

RIAN
RIAN