Human Generated Data

Title

Untitled (two men and a woman in store)

Date

1956

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18194

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.7%
Human 99.7%
Person 98.7%
Person 98.1%
Table Lamp 93.1%
Lamp 93.1%
Furniture 75.5%
Shop 74.5%
Person 65.8%
Indoors 62.2%
Shelf 59.5%
Clothing 59%
Apparel 59%
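
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. A minimal sketch of how such a list could be regenerated, assuming boto3 credentials are configured and a local copy of the photograph saved under the hypothetical filename untitled_store.jpg:

import boto3

# Hypothetical local copy of the photograph; the source image is not bundled here.
IMAGE_PATH = "untitled_store.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print label/confidence pairs in the same "Name score%" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")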

Clarifai
created on 2023-10-22

people 99.7%
many 98.7%
monochrome 97.6%
stock 97.2%
group 97.1%
furniture 97%
man 95.1%
room 93.3%
wear 93.3%
commerce 93%
street 92.7%
group together 92.1%
indoors 90.9%
adult 90.5%
one 89.2%
two 87%
outfit 84.9%
shopping 84.8%
department store 84.3%
woman 83%

Imagga
created on 2022-03-04

barbershop 59.1%
shop 51.4%
blackboard 41.6%
mercantile establishment 36.4%
man 26.9%
people 25.7%
place of business 24.3%
chair 22.9%
male 22%
interior 21.2%
room 20.7%
person 19.9%
work 18.8%
business 18.8%
barber chair 18.7%
office 18.1%
technology 17.1%
indoors 16.7%
center 15.6%
computer 15.2%
furniture 14.6%
seat 14.1%
adult 13.9%
home 13.6%
modern 13.3%
lifestyle 13%
house 12.5%
businessman 12.4%
education 12.1%
establishment 12.1%
men 12%
indoor 11.9%
kitchen 11.7%
hand 11.4%
classroom 11.1%
inside 11%
board 10.8%
worker 10.7%
working 10.6%
teacher 10.6%
occupation 10.1%
horizontal 10%
equipment 10%
happy 10%
professional 10%
corporate 9.4%
table 9.4%
senior 9.4%
casual 9.3%
portrait 9.1%
design 9%
school 9%
restaurant 8.9%
job 8.8%
smiling 8.7%
hospital 8.6%
college 8.5%
floor 8.4%
holding 8.3%
monitor 8.1%
looking 8%
day 7.8%
happiness 7.8%
architecture 7.8%
university 7.8%
patient 7.7%
class 7.7%
industry 7.7%
city 7.5%
clinic 7.4%
student 7.4%
dishwasher 7.4%
light 7.4%
food 7.2%
women 7.1%
counter 7.1%
medical 7.1%
look 7%

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.5%
black and white 85.7%
person 84.6%
white 75.6%
black 69.5%
clothing 66%
furniture 56.3%
shop 25.7%

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 94.1%
Calm 99.3%
Sad 0.3%
Happy 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0%
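
The age range, gender, and emotion scores above correspond to the face attributes Rekognition's DetectFaces returns when all attributes are requested. A minimal sketch under the same assumptions as the label example (boto3 configured, hypothetical local file untitled_store.jpg):

import boto3

client = boto3.client("rekognition")

with open("untitled_store.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")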

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 98.7%
Person 98.1%
Person 65.8%

Categories

Text analysis

Amazon

MJI7--YT37A°--A
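The detected string above is the kind of line-level result returned by Rekognition's DetectText. A minimal sketch under the same assumptions as the examples above:

import boto3

client = boto3.client("rekognition")

with open("untitled_store.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries are full detected lines; WORD entries are the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")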

Google

00 771477 24
00
771477
24