Human Generated Data

Title

Untitled (men and woman seated behind office table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16724

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Classroom 99.5
Room 99.5
School 99.5
Indoors 99.5
Person 99.4
Human 99.4
Person 99
Person 99
Person 98.9
Person 98.9
Person 98.1
Person 97.5
Tie 97.4
Accessories 97.4
Accessory 97.4
Clock Tower 87.5
Building 87.5
Architecture 87.5
Tower 87.5
Audience 83
Crowd 83
Furniture 65.2
Speech 65
Sitting 60.9
Lecture 56.5
Seminar 56.3
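
The Amazon tags above follow the shape of Rekognition's DetectLabels response. A minimal sketch of filtering such results by confidence, using sample values copied from the list above rather than a live API call; the 90-point threshold is an arbitrary illustration:

```python
# Sample response shaped like Amazon Rekognition DetectLabels output.
# Values are copied from the tag list above; this is not a live API call.
SAMPLE_RESPONSE = {
    "Labels": [
        {"Name": "Classroom", "Confidence": 99.5},
        {"Name": "Room", "Confidence": 99.5},
        {"Name": "Tie", "Confidence": 97.4},
        {"Name": "Clock Tower", "Confidence": 87.5},
        {"Name": "Seminar", "Confidence": 56.3},
    ]
}

def labels_above(response, threshold=90.0):
    """Return (name, confidence) pairs at or above the threshold,
    sorted highest-confidence first."""
    hits = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

print(labels_above(SAMPLE_RESPONSE))
# [('Classroom', 99.5), ('Room', 99.5), ('Tie', 97.4)]
```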

Clarifai
created on 2023-10-29

people 99.6
group 98.1
room 97.6
administration 96.7
desk 96.2
adult 94.1
leader 94.1
furniture 93.9
woman 93.4
man 92.7
chair 92.1
monochrome 91.8
group together 90.4
meeting 89.4
education 87.4
league 86.5
table 85.2
indoors 82.3
many 81.8
several 80.7

Imagga
created on 2022-02-26

barbershop 78.1
shop 67
mercantile establishment 48.7
man 38.6
room 35.8
office 34
place of business 32.7
people 31.2
male 27.7
table 27.5
person 27
indoors 26.4
interior 23.9
work 22.8
computer 22.5
desk 21.8
adult 21.5
professional 21.5
business 21.3
home 17.6
working 16.8
businessman 16.8
chair 16.4
establishment 16.3
sitting 16.3
hospital 16
happy 15.7
job 15
classroom 14.7
meeting 14.1
medical 14.1
furniture 14.1
occupation 13.8
corporate 13.8
indoor 13.7
worker 13.4
modern 13.3
laptop 13.2
smiling 13
lifestyle 13
executive 13
men 12.9
clinic 12.9
patient 12.7
women 12.7
businesspeople 12.3
businesswoman 11.8
horizontal 11.7
technology 11.1
inside 11
communication 10.9
teacher 10.9
restaurant 10.9
salon 10.8
holding 10.7
smile 10.7
group 10.5
two 10.2
team 9.9
monitor 9.7
portrait 9.7
senior 9.4
equipment 9.2
phone 9.2
kitchen 9.2
hand 9.1
suit 9
talking 8.6
keyboard 8.5
screen 8.4
attractive 8.4
mature 8.4
house 8.4
health 8.3
confident 8.2
board 8.1
light 8
looking 8
employee 7.8
education 7.8
assistant 7.8
check 7.7
exam 7.7
hairdresser 7.7
workplace 7.6
doctor 7.5
to 7.1
medicine 7
together 7
glass 7

Google
created on 2022-02-26

Window 92.3
Building 91
Black 89.6
Black-and-white 87.6
Picture frame 86.6
Style 84.2
Art 79.8
Chair 79.6
Monochrome 78.3
Font 77.5
Monochrome photography 77
Shelf 74.1
Bookcase 73.1
Table 72.5
Room 71.5
Shelving 66.2
Flooring 64.1
Stock photography 62.6
History 61.3
Visual arts 61

Microsoft
created on 2022-02-26

text 99.5
indoor 94
clock 89.6
table 82.1
house 61.6
furniture 55.2

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 96.1%
Sad 80.1%
Calm 11.2%
Confused 2.6%
Disgusted 2.3%
Happy 2.3%
Angry 0.7%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 72.4%
Surprised 7.9%
Confused 7.6%
Happy 6.5%
Sad 2.6%
Disgusted 1.7%
Angry 0.7%
Fear 0.6%

AWS Rekognition

Age 50-58
Gender Male, 100%
Sad 50.4%
Calm 16.3%
Angry 9.5%
Surprised 7.9%
Confused 5.2%
Fear 4.3%
Happy 4%
Disgusted 2.5%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 99.8%
Surprised 0.1%
Happy 0.1%
Confused 0%
Sad 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 97.5%
Calm 59%
Sad 19.5%
Happy 9.8%
Confused 3.9%
Fear 3%
Disgusted 2.1%
Surprised 1.6%
Angry 1.1%

AWS Rekognition

Age 31-41
Gender Male, 95.2%
Calm 61.5%
Happy 24.7%
Surprised 7.5%
Disgusted 1.8%
Confused 1.5%
Sad 1.3%
Angry 1%
Fear 0.7%
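
Each AWS Rekognition block above lists per-face emotion scores in the shape of a DetectFaces response. A minimal sketch of reading the dominant emotion from one face, using the first face's scores as sample values (not a live API call):

```python
# Sample face shaped like an Amazon Rekognition DetectFaces FaceDetail.
# Scores mirror the first face listed above; this is not a live API call.
SAMPLE_FACE = {
    "AgeRange": {"Low": 45, "High": 53},
    "Gender": {"Value": "Male", "Confidence": 96.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 80.1},
        {"Type": "CALM", "Confidence": 11.2},
        {"Type": "CONFUSED", "Confidence": 2.6},
        {"Type": "HAPPY", "Confidence": 2.3},
    ],
}

def dominant_emotion(face):
    """Return the emotion entry with the highest confidence score."""
    return max(face["Emotions"], key=lambda e: e["Confidence"])

top = dominant_emotion(SAMPLE_FACE)
print(top["Type"], top["Confidence"])  # SAD 80.1
```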

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Person 99%
Person 99%
Person 98.9%
Person 98.9%
Person 98.1%
Person 97.5%
Tie 97.4%
Clock Tower 87.5%

Categories

Imagga

interior objects 100%

Text analysis

Amazon

KODVROVEELA
THE
Wayne
BALANA