Human Generated Data

Title

Untitled (three male students working on wood structure in industrial arts classroom)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9396

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99
Human 99
Person 98.4
Person 95.6
Building 72.6
Factory 59.1
Flooring 57.2
Drawing 56.1
Art 56.1
Floor 55.1
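
The Amazon values above are labels paired with confidence scores in percent. As a minimal sketch (not the museum's actual pipeline), tags of this form could be produced with AWS Rekognition's DetectLabels call via boto3; the filename is a placeholder, and credentials and region are assumed to come from the environment.

```python
# Hedged sketch: generating label/confidence tags with AWS Rekognition.
# "4.2002.9396.jpg" is a placeholder filename, not the museum's asset path.
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.9396.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,  # the tags above bottom out around 55%
    )

for label in response["Labels"]:
    # Confidence is a float percentage, e.g. "Person 99.0"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```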

Clarifai
created on 2023-10-27

people 99.7
monochrome 99.4
adult 97.9
woman 96.7
man 96.6
two 96
group 95.1
furniture 93
industry 92.4
three 91.8
indoors 91.1
grinder 90.3
group together 87.8
several 85
administration 84.9
dig 84.7
vehicle 84.2
transportation system 83.8
child 82.8
street 81.1

Imagga
created on 2022-01-23

shop 32.9
bakery 30.2
restaurant 25.5
interior 23.9
room 22.3
man 21.5
cafeteria 21.2
mercantile establishment 20.6
building 18.6
working 18.5
people 18.4
table 18.3
business 18.2
kitchen 18.1
modern 16.8
indoors 15.8
place of business 14.7
work 14.2
equipment 14
lifestyle 13
home 12.8
office 12.4
hospital 12.4
hotel 12.4
chair 12.3
computer 12.1
male 12
women 11.8
furniture 11.4
technology 11.1
structure 11.1
person 10.7
happy 10.6
adult 10
professional 9.9
food 9.5
laptop 9.3
display 9.3
design 9
architecture 8.7
smiling 8.7
sitting 8.6
men 8.6
stove 8.6
glass 8.5
industry 8.5
floor 8.4
holding 8.2
occupation 8.2
indoor 8.2
light 8
counter 8
job 8
medical 7.9
3d 7.7
corporate 7.7
wall 7.7
dining 7.6
window 7.5
house 7.5
establishment 7.5
service 7.4
inside 7.4
hall 7.2
team 7.2
information 7.1
decor 7.1
medicine 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

man 93.3
text 93
person 91.2
black and white 88
indoor 85.7
clothing 83
preparing 52.6
cooking 29.4

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 89.6%
Calm 100%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Female, 91.4%
Calm 93.2%
Sad 4.6%
Confused 0.7%
Fear 0.4%
Disgusted 0.3%
Happy 0.3%
Angry 0.3%
Surprised 0.2%
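
The two blocks above list an estimated age range, a gender guess with its confidence, and an emotion distribution per detected face. As a hedged sketch, output of this shape comes from AWS Rekognition's DetectFaces call with full attributes requested; the filename is a placeholder.

```python
# Hedged sketch: per-face age/gender/emotion estimates with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.9396.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 27, "High": 37}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 89.6}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```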

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
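
The Google Vision blocks above rate each face on a five-step likelihood scale rather than a percentage. A minimal sketch of how such ratings might be obtained with the Cloud Vision face detection API follows; the filename is a placeholder and credentials are assumed to be configured in the environment.

```python
# Hedged sketch: per-face likelihood ratings with Google Cloud Vision.
from google.cloud import vision

# Likelihood enum values 0-5, printed in the same wording as this page
likelihoods = ("Unknown", "Very unlikely", "Unlikely",
               "Possible", "Likely", "Very likely")

client = vision.ImageAnnotatorClient()

with open("4.2002.9396.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", likelihoods[face.surprise_likelihood])
    print("Anger", likelihoods[face.anger_likelihood])
    print("Sorrow", likelihoods[face.sorrow_likelihood])
    print("Joy", likelihoods[face.joy_likelihood])
    print("Headwear", likelihoods[face.headwear_likelihood])
    print("Blurred", likelihoods[face.blurred_likelihood])
```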

Feature analysis

Amazon

Person 99%

Text analysis

Amazon

KODVK-EVEELA
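
The string above is raw OCR output reported verbatim. As a hedged sketch, text detections of this kind can be retrieved with AWS Rekognition's DetectText call; the filename is a placeholder.

```python
# Hedged sketch: extracting detected text with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.9396.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```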