Human Generated Data

Title

Three Men

Date

1939

People

Artist: Ben Shahn, American, 1898–1969

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Louis Agassiz Shaw Bequest and the Richard Norton Memorial Fund, 1997.20

Copyright

© Estate of Ben Shahn / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 99.6
Person 99.6
Person 99.6
Person 98.8
Art 98.5
Painting 94.5
Accessory 73.5
Accessories 73.5
Tie 73.5
Mural 55.6

Clarifai
created on 2018-02-09

people 99.8
religion 99.5
man 99.3
adult 99.2
painting 98.7
one 98.6
art 95.9
portrait 94.9
god 94.4
wear 93.6
two 92.9
saint 92.9
prayer 91.3
facial hair 90.4
spirituality 90.3
veil 88.4
book series 88.2
church 86.6
book 85.3
lid 84.7

Imagga
created on 2018-02-09

ruler 51.7
art 26.4
old 21.6
sculpture 20.8
culture 16.2
close 15.4
carving 15.2
money 14.5
religion 14.3
portrait 13.6
puzzle 13.5
currency 12.6
vintage 12.4
ancient 12.1
cash 11.9
man 11.4
face 11.4
travel 11.3
texture 11.1
antique 10.4
black 10.2
dollar 10.2
church 10.2
architecture 10.1
wall 9.9
bank 9.8
brown 9.6
paper 9.4
religious 9.4
grunge 9.4
temple 9.2
painter 9.2
letter 9.2
statue 9.1
masterpiece 8.9
detail 8.8
symbol 8.7
building 8.7
god 8.6
game 8.6
finance 8.4
banking 8.3
gold 8.2
aged 8.1
business 7.9
male 7.8
museum 7.8
pray 7.7
golden 7.7
dollars 7.7
holy 7.7
saint 7.7
spirituality 7.7
spiritual 7.7
bill 7.6
head 7.6
one 7.5
famous 7.4
paint 7.2
wealth 7.2
history 7.1
financial 7.1
icon 7.1

Google
created on 2018-02-09

painting 87.5
art 83.5
portrait 69.2
paint 61.4
modern art 61
mural 60.7
artwork 54.7

Microsoft
created on 2018-02-09

person 98.2
painting 21.8

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 98.3%
Confused 4.6%
Sad 8.3%
Disgusted 0.6%
Happy 2.3%
Angry 10.7%
Surprised 1.7%
Calm 71.9%

AWS Rekognition

Age 35-52
Gender Male, 99.7%
Disgusted 7.5%
Happy 1.4%
Calm 72.5%
Sad 4.2%
Angry 3.8%
Surprised 4.0%
Confused 6.7%

AWS Rekognition

Age 38-59
Gender Male, 92.9%
Sad 7.1%
Disgusted 1%
Surprised 1.4%
Calm 83.7%
Angry 2.5%
Happy 0.8%
Confused 3.5%

AWS Rekognition

Age 26-43
Gender Female, 50.7%
Calm 50%
Sad 47.6%
Happy 46%
Angry 45.7%
Surprised 45.3%
Disgusted 45.2%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Male, 51.7%
Happy 46.3%
Calm 45.7%
Confused 45.2%
Sad 49.9%
Surprised 45.4%
Angry 47.1%
Disgusted 45.4%

Microsoft Cognitive Services

Age 79
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Painting 94.5%
Tie 73.5%

Captions

Azure OpenAI

Created on 2024-01-27

This image depicts a painting divided into multiple sections with varying scenes and perspectives. The left side of the painting shows an interior scene with an individual in a white shirt. On the right side of the painting, there's a prominent green pillar. Behind this centerpiece, an urban streetscape with buildings, signs, and individuals in the background is visible. The color palette is muted with grey, green, and earth tones dominating the scene. The style carries a sense of realism with attention to detail and shading that gives depth to the figures and structures. The overall composition merges the architectural elements with the human figures, suggesting a narrative or social commentary.

Text analysis

Amazon

FTTUO
C2AI