Human Generated Data

Title

Untitled (boys working in woodshop)

Date

1949

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19354

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Person 98.9
Person 98.8
Person 98.5
Person 98
Person 90.3
Chair 89.7
Furniture 89.7
Person 86.1
Person 81.6
Machine 73.9
Person 70.8
Carpenter 68.2
Clinic 67.1
Building 66.4
Tripod 60.6
Workshop 60.2
Photography 60.2
Photo 60.2
People 57.9
Factory 57.5
Person 49.9
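The tagging services above emit flat `Label score` lines. A minimal sketch (a hypothetical helper, not part of the museum's pipeline) for turning those lines into structured pairs so they can be filtered by confidence:

```python
# Sample of the Amazon tag lines shown above (label, then confidence).
raw = """Person 99.8
Human 99.8
Chair 89.7
Carpenter 68.2
Workshop 60.2"""

def parse_tags(text):
    """Split each line into a label and its trailing confidence score."""
    records = []
    for line in text.splitlines():
        # rpartition keeps multi-word labels like "group together" intact.
        label, _, score = line.rpartition(" ")
        records.append((label, float(score)))
    return records

tags = parse_tags(raw)
# Keep only high-confidence labels, e.g. at or above 85.
confident = [label for label, score in tags if score >= 85]
print(confident)  # ['Person', 'Human', 'Chair']
```

The same parser works for the Clarifai and Imagga lists, since they use the identical `label score` line format.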

Clarifai
created on 2023-10-22

people 99.9
group 99.5
group together 99.5
adult 99.4
furniture 97.7
man 97.4
woman 97.2
room 96.2
several 95.8
five 95.8
four 94.9
education 94.6
three 94.6
desk 94.3
monochrome 94.3
two 94.3
administration 94
employee 93.8
school 93.4
many 92.5

Imagga
created on 2022-03-05

barbershop 100
shop 100
mercantile establishment 83.2
place of business 55.6
interior 30.1
chair 28.7
establishment 27.8
people 26.8
man 26.2
indoors 24.6
room 23.8
table 21.6
business 21.3
restaurant 20
men 18
modern 16.8
furniture 16.3
office 16.3
inside 15.6
indoor 15.5
male 14.2
women 13.4
person 12.5
lifestyle 12.3
floor 12.1
life 11.8
building 11.7
hairdresser 11.6
group 11.3
salon 11.2
work 11
architecture 10.9
light 10.7
adult 10.5
window 10.2
casual 10.2
glass 10.1
dinner 10.1
equipment 9.8
urban 9.6
home 9.6
dining 9.5
seat 9.4
meeting 9.4
decoration 9.4
elegance 9.2
design 9
decor 8.8
businessman 8.8
comfortable 8.6
corporate 8.6
industry 8.5
communication 8.4
house 8.4
coffee 8.3
bar 8.3
occupation 8.2
worker 8.1
team 8.1
working 8
cafeteria 7.9
food 7.9
chairs 7.8
barber chair 7.8
travel 7.7
hotel 7.6
two 7.6
desk 7.6
fashion 7.5
happy 7.5
mature 7.4
style 7.4
teamwork 7.4
smiling 7.2
handsome 7.1
family 7.1
job 7.1
hall 7
professional 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.9
clothing 94.2
person 93.3
man 72.6
black and white 54.3

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 97.3%
Calm 98.5%
Happy 0.6%
Surprised 0.3%
Confused 0.2%
Disgusted 0.2%
Sad 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 72.7%
Calm 99.2%
Sad 0.7%
Happy 0%
Disgusted 0%
Surprised 0%
Confused 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 47-53
Gender Male, 92.3%
Calm 90.5%
Sad 8%
Happy 0.4%
Confused 0.3%
Fear 0.3%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
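Each AWS Rekognition block above reports a full emotion distribution per detected face. A short sketch (the dictionaries below are transcribed from the three face records shown, not an API response) of reducing each distribution to its dominant emotion:

```python
# Per-face emotion percentages, copied from the three AWS Rekognition
# face records above (minor emotions omitted for brevity).
faces = [
    {"Calm": 98.5, "Happy": 0.6, "Surprised": 0.3},
    {"Calm": 99.2, "Sad": 0.7},
    {"Calm": 90.5, "Sad": 8.0, "Happy": 0.4},
]

# For each face, pick the emotion with the highest confidence.
dominant = [max(face, key=face.get) for face in faces]
print(dominant)  # ['Calm', 'Calm', 'Calm']
```

All three faces resolve to "Calm", matching the top-ranked emotion in each record above.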

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99.8%
Person 98.9%
Person 98.8%
Person 98.5%
Person 98%
Person 90.3%
Person 86.1%
Person 81.6%
Person 70.8%
Person 49.9%
Chair 89.7%

Text analysis

Amazon

KODAK-SL

Google

YT37A2 -XAO
YT37A2
-XAO