Human Generated Data

Title

Untitled (Red Cross workers with blood donors, Philadelphia, PA)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10587

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Each entry below pairs a machine-generated label with the service's confidence score on a 0-100 scale.

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 99.2
Person 99.2
Person 99.2
Person 98.7
Person 98.5
Person 98.4
Person 94
Chair 90.6
Furniture 90.6
Person 89.7
Interior Design 88.1
Indoors 88.1
Room 79.4
Floor 76.5
Flooring 73.9
Clothing 71.9
Apparel 71.9
Screen 71.6
Electronics 71.6
Monitor 69.3
Display 69.3
Sitting 66.8
LCD Screen 65.8
Face 61.1
Poster 60.4
Advertisement 60.4
Collage 58.3
Table 56.1
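
The Amazon tags above match the label-plus-confidence shape returned by AWS Rekognition's DetectLabels API. A minimal sketch of how a comparable list could be produced with boto3, assuming configured AWS credentials; the file name and thresholds are hypothetical:

```python
import boto3

# Assumes AWS credentials are configured; the file name is hypothetical.
client = boto3.client("rekognition")

with open("steinmetz_10587.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the lowest score in the list above is ~56
)

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```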

Clarifai
created on 2023-10-25

people 99.8
group 99.3
adult 98
woman 97.5
group together 96.9
man 96.4
music 96.1
actor 95.7
room 95.5
furniture 94.5
musician 93.7
indoors 93.2
education 92.9
actress 91.8
piano 90.7
chair 90.5
outfit 89.3
leader 88.8
administration 88.7
wear 88.6
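
Clarifai exposes comparable predictions through its v2 predict endpoint; its concepts are scored 0-1, so the values above appear multiplied by 100. A sketch assuming the public general-image-recognition model; the API key and image URL are placeholders:

```python
import requests

# Placeholders: supply a real Clarifai API key and a reachable image URL.
API_KEY = "YOUR_CLARIFAI_API_KEY"
IMAGE_URL = "https://example.com/steinmetz_10587.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with a 0-1 value; the list above shows value * 100.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```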

Imagga
created on 2022-01-09

man 37
person 34.2
office 32
male 30.5
teacher 28.6
room 28.3
chair 27.9
people 27.3
business 26.1
adult 25.5
sitting 24
men 24
indoors 23.7
computer 23.5
classroom 23.2
businessman 22.9
professional 22.1
barbershop 21
laptop 19.9
home 19.1
shop 19
women 17.4
group 16.9
modern 16.8
communication 16.8
work 16.5
meeting 16
hairdresser 15.8
board 15.4
television 15.1
job 15
executive 14.8
table 14.8
indoor 14.6
smiling 14.5
lifestyle 14.5
occupation 13.7
desk 13.3
interior 13.3
working 13.3
happy 13.2
mature 13
mercantile establishment 12.8
class 12.5
handsome 12.5
senior 12.2
education 12.1
corporate 12
casual 11.9
confident 11.8
educator 11.8
team 11.6
furniture 11.6
salon 11.5
looking 11.2
back 11
worker 10.8
blackboard 10.7
smile 10.7
together 10.5
talking 10.5
couple 10.4
career 10.4
businesswoman 10
employee 9.8
school 9.8
students 9.7
student 9.7
technology 9.6
patient 9.6
screen 9.4
phone 9.2
telecommunication system 9.2
house 9.2
to 8.9
success 8.8
conference 8.8
place of business 8.6
sofa 8.6
businesspeople 8.5
portrait 8.4
study 8.4
manager 8.4
standing 7.8
color 7.8
studying 7.7
entrepreneur 7.7
exam 7.7
health 7.6
relax 7.6
learning 7.5
seat 7.4
equipment 7.3
cheerful 7.3
monitor 7.3
alone 7.3
girls 7.3
suit 7.2
case 7.2
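
Imagga serves its tagger over a REST endpoint authenticated with an API key/secret pair. A sketch of one plausible way to request tags like those above; the response shape is assumed from Imagga's v2 API, and all credentials and URLs are placeholders:

```python
import requests

# Placeholders: Imagga issues an API key/secret pair used as HTTP basic auth.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.com/steinmetz_10587.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each tag pairs an English label with a 0-100 confidence, as listed above.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```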

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.1
person 94.3
black and white 93.2
clothing 90.2
indoor 87.5
man 76.9
furniture 53.3
computer 47.9
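
The Microsoft tags are the kind returned by Azure Computer Vision's tag operation, which scores tags 0-1 (rendered above as 0-100). A sketch against the v3.2 REST surface, with a placeholder resource endpoint and key:

```python
import requests

# Placeholders: an Azure Computer Vision resource endpoint and key.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"
IMAGE_URL = "https://example.com/steinmetz_10587.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidence is returned on a 0-1 scale; the list above shows it * 100.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```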

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99%
Sad 40.2%
Calm 16.3%
Disgusted 15.7%
Surprised 14.2%
Confused 7.4%
Happy 2.7%
Angry 2.1%
Fear 1.4%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Sad 39.4%
Calm 26.6%
Happy 23.8%
Confused 3.2%
Angry 2.4%
Surprised 1.7%
Disgusted 1.7%
Fear 1.3%

AWS Rekognition

Age 42-50
Gender Female, 63%
Calm 91.8%
Happy 5%
Sad 1%
Disgusted 0.7%
Fear 0.5%
Surprised 0.5%
Confused 0.4%
Angry 0.2%

AWS Rekognition

Age 33-41
Gender Female, 86.4%
Happy 95%
Surprised 2%
Calm 1%
Fear 0.9%
Sad 0.6%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 45-51
Gender Female, 96.8%
Calm 96.8%
Sad 1.3%
Confused 0.6%
Surprised 0.4%
Happy 0.3%
Disgusted 0.3%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 39-47
Gender Female, 98.9%
Calm 66.4%
Happy 24.1%
Sad 5.5%
Surprised 2.5%
Confused 0.4%
Disgusted 0.4%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 41-49
Gender Male, 99.1%
Sad 37.8%
Calm 34.8%
Happy 13.7%
Surprised 5.3%
Angry 2.9%
Confused 2.2%
Fear 2.2%
Disgusted 1.1%
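
The age ranges, gender calls, and emotion breakdowns above follow the shape of AWS Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, assuming boto3, configured AWS credentials, and a hypothetical local copy of the image:

```python
import boto3

client = boto3.client("rekognition")

with open("steinmetz_10587.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unordered; sort to match the high-to-low lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```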

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
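
Google Cloud Vision reports face attributes as likelihood buckets rather than numeric scores, which is why the entries above read "Very unlikely" instead of percentages. A sketch using the google-cloud-vision client, assuming configured Google Cloud credentials and a hypothetical file name:

```python
from google.cloud import vision

# Assumes Google Cloud credentials are configured in the environment.
client = vision.ImageAnnotatorClient()

with open("steinmetz_10587.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihood is an enum (VERY_UNLIKELY ... VERY_LIKELY), one value per
# attribute per detected face, matching the verbal entries above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```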

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

20999
20977

Google

20977.
20977.
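
The numbers under Text analysis are OCR detections; on the Amazon side this corresponds to Rekognition's DetectText API. A minimal sketch, assuming boto3, configured AWS credentials, and a hypothetical local file:

```python
import boto3

client = boto3.client("rekognition")

with open("steinmetz_10587.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to the strings shown above (e.g. "20999").
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```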